Sunday, March 1, 2015

Docker Setup for Elasticsearch, Logstash and Kibana

Now that you're familiar with Docker, we'll put that knowledge to work by setting up an ELK (Elasticsearch, Logstash, Kibana) logging stack with it.

Prerequisites: Install boot2docker and set up VirtualBox. It helps to be familiar with ELK architecture and the manual steps. This guide was written for Mac users.

Initial Setup

# boot2docker initialization
boot2docker init
boot2docker start

# now run the commands listed in the shell to...
# ...set your ENV variables

# make a mountable directory on the host
boot2docker ssh
sudo mkdir -p /data/log /data/data
sudo chgrp -R staff /data
sudo chmod -R 775 /data
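The "commands listed in the shell" step above refers to the environment-variable exports that boot2docker prints after starting. A minimal sketch, assuming boot2docker 1.3 or later (which provides the shellinit subcommand):

```shell
# print and evaluate the Docker client environment variables
# (DOCKER_HOST, DOCKER_CERT_PATH, DOCKER_TLS_VERIFY)
eval "$(boot2docker shellinit)"

# confirm the client can now reach the daemon inside the VM
docker info
```

If `docker info` prints daemon details rather than a connection error, the client is wired up correctly.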

Learning Docker by Setting Up ELK

I encourage you to first go through Docker's excellent documentation. Afterwards, see this article for an overview of ELK, and if you ever need to reference the manual steps for any reason, check out this guide. All of the repositories used in this guide can be found on my GitHub account.

Fetching the Repos

# create a dockerfiles directory
mkdir ~/dockerfiles
cd ~/dockerfiles

# pull down all of the dockerfiles
git clone https://github.com/roblayton/docker-elasticsearch.git
git clone https://github.com/roblayton/docker-logstash.git
git clone https://github.com/roblayton/docker-logstash-forwarder.git
git clone https://github.com/roblayton/docker-kibana.git

# fetch the ip address of the...
# ...boot2docker vm for future reference
boot2docker ip
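To avoid retyping the address, you can capture it in a shell variable and substitute it wherever <BOOT2DOCKERIP> appears below (DOCKER_IP is just an illustrative name, not something the repos expect):

```shell
# store the VM's address for later commands;
# older boot2docker releases print a descriptive message on stderr,
# so discard it and keep only the address
DOCKER_IP=$(boot2docker ip 2>/dev/null)
echo "boot2docker VM is at $DOCKER_IP"
```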

Generating the Keys

You'll want to generate a key and certificate for the logstash server container, and then copy the certificate over to the logstash-forwarder.

# generate certificates
# when prompted, enter the boot2docker ip address
cd ~/dockerfiles/docker-logstash/certs
./lc-tlscert

cp ~/dockerfiles/docker-logstash/certs/selfsigned.crt ~/dockerfiles/docker-logstash-forwarder/certs/selfsigned.crt
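Since the forwarder will refuse the TLS connection if its copy of the certificate doesn't match the server's, it's worth confirming the copy succeeded. A quick sanity check (not part of the original steps):

```shell
# cmp exits non-zero if the two files differ
cmp ~/dockerfiles/docker-logstash/certs/selfsigned.crt \
    ~/dockerfiles/docker-logstash-forwarder/certs/selfsigned.crt \
  && echo "certificates match"
```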

Elasticsearch

I wouldn't recommend running Elasticsearch, MongoDB, or any other service that needs persistent storage inside Docker. You're better off setting up Elasticsearch manually; we're only running it through Docker here for tutorial purposes.

# build the image
cd ~/dockerfiles/docker-elasticsearch
docker build -t elasticsearch .

# run the container with ports 9200/9300 mapped and /data mounted
docker run -d -p 9200:9200 -p 9300:9300 --name elasticsearch -v /data:/data -t elasticsearch

# make sure elasticsearch is working
curl http://<BOOT2DOCKERIP>:9200/_search?pretty
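Beyond _search, Elasticsearch's cluster health endpoint gives a quick one-line status check; a sketch, assuming the default HTTP port from the run command above:

```shell
# expect "status" : "green" or "yellow"
# (yellow is normal for a single node with replicas configured)
curl http://<BOOT2DOCKERIP>:9200/_cluster/health?pretty
```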

Logstash

# build the image
cd ~/dockerfiles/docker-logstash
docker build -t logstash .

# run and link the container to elasticsearch
docker run -e ES_HOST=<BOOT2DOCKERIP> -e ES_HTTP_PORT=9200 -e ES_PORT=9300 -d -p 5043:5043 -p 9292:9292 --name logstash --link elasticsearch:es -t logstash

To ensure you've set up Logstash properly, run the container in interactive mode with the following command (remove the detached container first with docker rm -f logstash, since the name is already taken):

docker run -e ES_HOST=<BOOT2DOCKERIP> -e ES_HTTP_PORT=9200 -e ES_PORT=9300 -p 5043:5043 -p 9292:9292 --name logstash --link elasticsearch:es -it logstash

Logstash-forwarder

# build the image
cd ~/dockerfiles/docker-logstash-forwarder
docker build -t logstash-forwarder .

# run and link the container to logstash
docker run -e LOGSTASH_SERVER=<BOOT2DOCKERIP>:5043 -d --name logstash-forwarder -v /data:/data --link logstash:logstash -t logstash-forwarder

Note: If you find that your containers don't seem to be working properly, run them in the foreground by removing the -d flag and watch stdout. It's also wise to run the processes outside of supervisor while debugging.
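Even with -d, you can inspect a running container's output without restarting it:

```shell
# list running containers and their port mappings
docker ps

# tail a container's stdout/stderr (Ctrl-C to stop following)
docker logs -f logstash
```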

Kibana

# build the image
cd ~/dockerfiles/docker-kibana
docker build -t kibana .

# run
docker run -e ES_HOST=<BOOT2DOCKERIP> -e ES_PORT=9200 -d -p 80:80 --name kibana -t kibana

Test That Everything Is Working

# ssh into the host machine
boot2docker ssh

# write to the data/log directory
sudo touch /data/log/test.log
echo 'time="2015-02-28T23:00:05Z" level="info" msg="This is a test"' >> /data/log/test.log
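If you want each test entry to carry the current time instead of a fixed one, the same line can be generated with date (the msg text here is arbitrary):

```shell
# build a log line in the same time/level/msg format,
# using the current UTC timestamp
ts=$(date -u +%Y-%m-%dT%H:%M:%SZ)
echo "time=\"$ts\" level=\"info\" msg=\"This is a test\""
```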

Now navigate to Kibana in your browser at http://<BOOT2DOCKERIP>. You should see the log entry in the graph at the top of the page as well as in the table below it.
