In this post we enable more robust monitoring of our OpenStack cluster by sending our OpenStack logs to an ELK Stack (Elasticsearch, Logstash, Kibana). It’s a lot easier than you might think. Read on to find out how easy…
This article assumes you already have a running Elastic Stack. If you don’t, go through my post Elk stack in under ten minutes and come back in 11 minutes. OpenStack is made up of many components, all of which install and log in essentially the same way. For this post we are going to limit our discussion to sending OpenStack Keystone logs to our ELK Stack. The other components work similarly, so don’t fret.
We are going to configure Logstash to receive our logs from our OpenStack controller node. To do this, we will install the Logstash Beats input plugin, which lets Logstash receive events shipped by the Filebeat client. Logstash then parses the logs into fields that Elasticsearch can index, making it easier for you to search the data.
Run the following commands on your ELK stack server:
# cd /usr/share/logstash/
# bin/logstash-plugin install logstash-input-beats
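To confirm the plugin installed correctly, you can list Logstash's installed plugins (this assumes the same /usr/share/logstash package install path used above):

```shell
cd /usr/share/logstash
# The Beats input plugin should appear in the list after a successful install
bin/logstash-plugin list | grep beats
```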
Create a new file named beats.conf in /etc/logstash/conf.d and add the following contents:
input {
  beats {
    port => "5043"
  }
}
output {
  elasticsearch {
    hosts => ["192.168.1.133:9200"]
  }
  stdout { codec => rubydebug }
}
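After saving beats.conf, it's worth validating the pipeline syntax and restarting Logstash so the new configuration is picked up. A sketch, assuming the systemd unit name logstash.service that the package install creates:

```shell
# Check the new pipeline configuration for syntax errors before loading it
sudo /usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/beats.conf

# Restart Logstash to load the Beats pipeline
sudo systemctl restart logstash.service

# Confirm Logstash is now listening on the Beats port
sudo ss -tlnp | grep 5043
```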
Be sure to replace the IP for hosts with your particular IP for your ELK Stack.
Install the Filebeat client by running these commands:
$ curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-5.2.0-amd64.deb
$ sudo dpkg -i filebeat-5.2.0-amd64.deb
Edit the Filebeat configuration to send the logs to our ELK Stack:
$ sudo vim /etc/filebeat/filebeat.yml
Make sure you have the following configurations enabled:
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/keystone/*.log

output.logstash:
  hosts: ["192.168.1.133:5043"]
Again, make sure you use the correct IP for your ELK Stack. Notice that the paths configuration picks up all the .log files in the /var/log/keystone directory. Simply add the rest of your OpenStack logs in this manner to get them all into Elasticsearch.
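As an illustration, the prospector section can list multiple glob patterns; a sketch assuming the default log directories for Nova and Glance on the controller (adjust these paths to whatever services your deployment actually runs):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/keystone/*.log
    - /var/log/nova/*.log      # assumed default Nova log directory
    - /var/log/glance/*.log    # assumed default Glance log directory
```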
Start and enable the Filebeat service:
$ sudo systemctl enable filebeat.service
$ sudo systemctl start filebeat.service
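To verify Filebeat came up cleanly and is connecting to Logstash, check the service status and tail Filebeat's own log (the /var/log/filebeat/filebeat path is the default log location for the 5.x deb package; yours may differ if you changed the logging section of filebeat.yml):

```shell
# Confirm the service is active and running
sudo systemctl status filebeat.service

# Watch Filebeat's own log for connection errors to Logstash
sudo tail -f /var/log/filebeat/filebeat
```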
Next we need to load the Filebeat index template into Elasticsearch. You only have to do this once, even if you are setting up logging for multiple servers with Filebeat.
$ curl -XPUT 'http://192.168.1.133:9200/_template/filebeat' -d@/etc/filebeat/filebeat.template.json
Make sure to use the correct IP address for your Elasticsearch server.
Go back to your Kibana page and you’ll see your logs are available for searching! They should be available under the ‘filebeat-*’ index. Click on Discover and search for ‘keystone’ and you should see all your Keystone logs.
Update (4/19/2017): If Kibana doesn’t find your index yet, give it some time; it all depends on your hardware performance and memory. With my current setup (running in a 2×4 OpenStack instance on a Dell 2950 server) it took a considerable amount of time to start showing the index, which caused me a ton of headaches. A quick way to check is to use the URL below to have Elasticsearch show you what indices it has:
http://[your_es_ip]:9200/_cat/indices
And you can see from the output below that it now has my filebeat-* indices:
yellow open filebeat-2017.04.19 skBC5yBRSp-GdkjEC-IirQ 5 1 12527927 0 1.6gb 1.6gb
yellow open filebeat-2017.04.20 n58fbF0fSAqUIZbTzKBkAQ 5 1 2505660 0 330.9mb 330.9mb
yellow open .kibana Xgj3pKAuQxuBkFFJYgboGg 1 1 2 0 12.6kb 12.6kb
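If you want to script this check, the _cat/indices output is easy to parse with standard shell tools. A minimal sketch, using the sample output above as test data (against a live cluster you would pipe curl into the same grep):

```shell
# Sample _cat/indices output, captured from the cluster above
indices='yellow open filebeat-2017.04.19 skBC5yBRSp-GdkjEC-IirQ 5 1 12527927 0 1.6gb 1.6gb
yellow open filebeat-2017.04.20 n58fbF0fSAqUIZbTzKBkAQ 5 1 2505660 0 330.9mb 330.9mb
yellow open .kibana Xgj3pKAuQxuBkFFJYgboGg 1 1 2 0 12.6kb 12.6kb'

# Count the filebeat-* indices -- if this prints 0, no logs have arrived yet
echo "$indices" | grep -c 'filebeat-'
# → 2
```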
See I told you that would be easy. If you liked this post please signup for my mailing list below to get updated every Friday about my latest posts for the week.
Tags: elasticsearch, kibana, logstash, openstack