Managing logs efficiently is crucial for maintaining the health and performance of your applications. AWS provides a robust solution for log forwarding from Linux EC2 instances via Kinesis Agent to AWS Elasticsearch. This post will guide you through the process, ensuring your logs are seamlessly transferred and indexed for real-time analysis.

Prerequisites

Before we begin, ensure you have the following:

  • An AWS account
  • A running Linux EC2 instance
  • AWS CLI installed and configured
  • IAM role with appropriate permissions for Kinesis and Elasticsearch
  • AWS Elasticsearch domain set up

Step 1: Install Kinesis Agent

First, install the Kinesis Agent on your Linux EC2 instance. The Kinesis Agent is a stand-alone Java application that continuously monitors a set of files and sends new data to Kinesis Data Streams or Kinesis Data Firehose.

sudo yum install -y aws-kinesis-agent

For Ubuntu:

sudo apt-get update

sudo apt-get install -y aws-kinesis-agent
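
If the aws-kinesis-agent package is not available in your distribution's repositories, you can install the agent from its GitHub repository instead. The sketch below assumes git and a Java runtime are available and uses the awslabs project's setup script; treat it as a starting point rather than an exact recipe.

# Alternative install from source (assumes Java and git; package names may vary by release)
sudo apt-get install -y git openjdk-8-jdk
git clone https://github.com/awslabs/amazon-kinesis-agent.git
cd amazon-kinesis-agent
sudo ./setup --install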

Step 2: Configure Kinesis Agent

Next, configure the Kinesis Agent to specify the log files you want to monitor and forward. The configuration file is located at /etc/aws-kinesis/agent.json.

{
  "cloudwatch.emitMetrics": true,
  "kinesis.endpoint": "kinesis.us-west-2.amazonaws.com",
  "firehose.endpoint": "firehose.us-west-2.amazonaws.com",
  "flows": [
    {
      "filePattern": "/var/log/*.log",
      "deliveryStream": "your-delivery-stream-name"
    }
  ]
}

Replace your-delivery-stream-name with the name of your Kinesis Firehose delivery stream.
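
The agent can also pre-process records before delivery. As an optional, hedged example, the flow below assumes your logs are in Apache common log format and uses the agent's LOGTOJSON processing option to convert each line to JSON before sending it; adjust logFormat (or drop dataProcessingOptions entirely) to match your logs.

"flows": [
  {
    "filePattern": "/var/log/*.log",
    "deliveryStream": "your-delivery-stream-name",
    "dataProcessingOptions": [
      { "optionName": "LOGTOJSON", "logFormat": "COMMONAPACHELOG" }
    ]
  }
]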

Step 3: Start the Kinesis Agent

Start the Kinesis Agent to begin forwarding logs.

sudo service aws-kinesis-agent start

Verify that the agent is running:

sudo service aws-kinesis-agent status
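
The agent also writes its own log, typically to /var/log/aws-kinesis-agent/aws-kinesis-agent.log, which is the quickest way to confirm that records are being parsed and sent:

# Watch the agent's log; look for periodic lines reporting records parsed and sent
tail -f /var/log/aws-kinesis-agent/aws-kinesis-agent.log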

Step 4: Set Up Kinesis Firehose

In the AWS Management Console, navigate to Kinesis Firehose and create a new delivery stream. Configure the destination to be your AWS Elasticsearch domain.

  1. Source: Choose Direct PUT or other sources.
  2. Transform: Optional – configure record transformation (via a Lambda function) if you need it.
  3. Destination: Select Amazon Elasticsearch Service and configure the index name, type name, and index rotation period.
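
If you prefer the CLI to the console, the same delivery stream can be created with aws firehose create-delivery-stream. The sketch below is illustrative only: the role ARN, domain ARN, index name, and backup bucket are placeholders you must replace, and buffering and retry settings are left at their defaults.

# Hypothetical destination config; replace every ARN, the index name, and the bucket
cat > es-destination.json <<'EOF'
{
  "RoleARN": "arn:aws:iam::account-id:role/your-firehose-role",
  "DomainARN": "arn:aws:es:region:account-id:domain/your-elasticsearch-domain",
  "IndexName": "application-logs",
  "IndexRotationPeriod": "OneDay",
  "S3BackupMode": "FailedDocumentsOnly",
  "S3Configuration": {
    "RoleARN": "arn:aws:iam::account-id:role/your-firehose-role",
    "BucketARN": "arn:aws:s3:::your-backup-bucket"
  }
}
EOF

aws firehose create-delivery-stream \
  --delivery-stream-name your-delivery-stream-name \
  --elasticsearch-destination-configuration file://es-destination.json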

Step 5: Configure IAM Role

Ensure your EC2 instance has an IAM role with the necessary permissions to interact with Kinesis Firehose and Elasticsearch. Attach a policy like the following to the role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "firehose:PutRecord",
        "firehose:PutRecordBatch"
      ],
      "Resource": "arn:aws:firehose:region:account-id:deliverystream/your-delivery-stream-name"
    },
    {
      "Effect": "Allow",
      "Action": [
        "es:ESHttpPost",
        "es:ESHttpPut"
      ],
      "Resource": "arn:aws:es:region:account-id:domain/your-elasticsearch-domain/*"
    }
  ]
}

Replace region, account-id, your-delivery-stream-name, and your-elasticsearch-domain with your actual values.
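
One way to attach the policy from the command line is to save it to a file and use aws iam put-role-policy against the role associated with your instance profile. The role and policy names below are placeholders:

# Assumes the policy above was saved as kinesis-es-policy.json
aws iam put-role-policy \
  --role-name your-ec2-instance-role \
  --policy-name kinesis-es-log-forwarding \
  --policy-document file://kinesis-es-policy.json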

Step 6: Verify Log Forwarding

Check your AWS Elasticsearch domain to verify that logs are being forwarded and indexed correctly. Kibana, which is integrated with the Elasticsearch domain, can then be used to visualize and analyze the log data.
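
If you prefer to check from the command line, you can query the domain endpoint directly, assuming your access policy or network configuration allows it; the endpoint and index name below are placeholders:

# List indices to confirm Firehose is creating and writing to them
curl -s "https://your-elasticsearch-domain-endpoint/_cat/indices?v"

# Peek at a few recently indexed documents (index name as configured in Firehose)
curl -s "https://your-elasticsearch-domain-endpoint/application-logs*/_search?size=3&pretty"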

Conclusion

Setting up log forwarding from Linux EC2 instances via Kinesis Agent to AWS Elasticsearch provides a robust and scalable log management and analysis solution. Following the above steps, you can ensure your application logs are efficiently forwarded and indexed for real-time insights.