Amazon Elasticsearch for Centralized Log Management
This blog is about using Amazon Elasticsearch with Amazon CloudWatch. I had a use case where I had to set up centralized logging, which can be done using the Amazon Elasticsearch and CloudWatch services.
Prerequisites:
A new Amazon Elasticsearch instance can be set up from here.
A new EC2 instance, with an attached IAM role that has the CloudWatch full-access policy, should be launched to install the CloudWatch daemon. Steps to install the CloudWatch daemon are listed below:
[js]
wget https://s3.amazonaws.com/aws-cloudwatch/downloads/latest/awslogs-agent-setup.py
sudo python ./awslogs-agent-setup.py --region us-east-1
[/js]
While executing the above steps, a few details have to be provided. These are listed below and also shown in the next screenshot:
- AWS region name
- AWS output format
- Path of log file to upload [/var/log/syslog]
- Destination Log Group name [/var/log/syslog]
- Log Stream name
- Log Event timestamp format
- Initial position of upload
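Once these prompts are answered, the agent saves them to its configuration file (typically /var/awslogs/etc/awslogs.conf). As a rough sketch, a syslog entry in that file might look like the following (the state-file path and the {instance_id} stream name are illustrative defaults, not values from this setup):
[js]
[general]
state_file = /var/awslogs/state/agent-state

[/var/log/syslog]
file = /var/log/syslog
log_group_name = /var/log/syslog
log_stream_name = {instance_id}
datetime_format = %b %d %H:%M:%S
initial_position = start_of_file
[/js]
Editing this file and restarting the agent is an alternative to re-running the interactive setup if any of the answers need to change later.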
Now, the log streams can be seen on AWS Cloudwatch Console as:
Since the Elasticsearch instance is already up and running in the account, we can now stream the above syslogs to Amazon Elasticsearch. We need to provide the Elasticsearch cluster name, as shown in the screenshot below, and click Next:
Now, filters can be applied (the filter pattern syntax can be referred to here), and the pattern can be debugged while writing it using “Test Pattern”:
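For reference, a few simple CloudWatch Logs filter patterns (these are generic examples of the pattern syntax, not patterns required by this setup):
[js]
ERROR
?ERROR ?WARN
"Invalid user"
[/js]
The first matches events containing the term ERROR, the second matches events containing either ERROR or WARN, and the quoted form matches an exact phrase. Leaving the pattern empty streams every log event.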
Now, when we click the “Start Streaming” button, we get a success window as shown below. This process takes some time.
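To confirm from the command line that the streaming subscription was actually created on the log group, the AWS CLI can be used (this assumes the CLI is configured with credentials for the same account; output will be empty until the console wizard finishes):

```shell
# List subscription filters attached to the syslog log group;
# a successful setup shows one entry pointing at the Elasticsearch
# delivery function created by the console wizard.
aws logs describe-subscription-filters \
    --log-group-name /var/log/syslog \
    --region us-east-1
```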
Now, we simply need to click on “Kibana 3” or “Kibana 4” (as shown in the above screenshot) to get the desired logs on the Kibana UI. This UI is illustrated in the screenshot below: