Redis and the Elastic stack

The first thing that popped into my mind when writing this post’s topic is that it sounds like a band fresh out of the 80’s. Redis and the Elastic stack. 

This week’s mission was to get more of an understanding of how the Elastic Stack (formerly known as the ELK stack) functions. Part of that was coupling it up with Redis in order to store time series data, which in turn would get sent on to Logstash and then to Kibana for graphing purposes.

To get a better feel for this in a development scenario I figured it was only right to set this up in Docker and give it a whirl. In this post I wanted to sum up a few things I learned along the way and also shine some light on some information that was tricky to find elsewhere online.

If you’re interested in following along and building your own Elastic Stack with Redis you can pull the Docker images for the Elastic Stack here. For Redis I am just using the Official Redis image that you can pull from Docker Hub.

So first things first, let’s do a docker pull on the two images we’re going to need: Redis and the Elastic Stack image.
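Assuming the sebp/elk image for the Elastic Stack (the one linked above) and the official Redis image from Docker Hub, the pulls look like this:

```shell
# Pull the official Redis image and the Elastic Stack image
docker pull redis
docker pull sebp/elk
```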


Now that we’ve got our images we’re going to spin up two containers.

Let’s start with our Redis container.

docker run -P -it --name redis redis

It’s important that we give our Redis container a name as we’re going to need that when we create our Elastic Stack container. This comes in handy when we need to link our two containers together.


Now that we’ve got Redis up and running we can hit Ctrl+P followed by Ctrl+Q to detach from our interactive Docker session.

Now we’re going to need our Elastic Stack Container up.

docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 -p 5000:5000 -it --link redis sebp/elk

This should go through the start up process of Kibana, Logstash and Elasticsearch. Now that we have everything up and running you’ll be able to access Kibana and Elasticsearch via your web browser.

If everything went okay you should be able to connect to both of these web interfaces.
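If you’d rather check from the terminal than the browser, a quick curl against the two published ports does the trick (9200 for Elasticsearch, 5601 for Kibana):

```shell
# Elasticsearch should answer with a JSON blob of node and cluster info
curl http://localhost:9200

# Kibana should answer with HTTP headers on its web port
curl -I http://localhost:5601
```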



Elasticsearch JSON interface

We can close the web browsers for now as we need to send some data through to Redis before we can continue any further. Before we do that though we need to make some slight configuration changes in our Elastic Stack container.

Let’s start a new shell inside our container like so.

docker exec -it 54 /bin/bash

In this example ’54’ is the first two characters of my container ID, which is enough of a unique prefix for Docker to work out which container I want to run a new shell in.

After you’re in, make sure your Container is linked correctly to the Redis container.


By taking a look in /etc/hosts we can see down the bottom that we’re linked to our Redis container.
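For reference, the entry that --link adds looks something like the line in the comment below (the IP address and container ID are hypothetical examples and will differ on your machine):

```shell
# Print the container's hosts file; the linked Redis container
# appears as an extra entry down the bottom, e.g.:
#   172.17.0.2    redis 54ab12cd34ef
cat /etc/hosts
```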

Now head over to /etc/logstash/conf.d to check out the configuration files for Logstash.


This is where we’re going to add a new file to let Logstash know that it should be pulling information from Redis and passing that along to Elasticsearch. Let’s call our file ‘03-redis.conf’. Since I am more traditionally a Windows guy I am going to use nano for creating and editing files (sorry, vi is wack).
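As a sketch, a minimal ‘03-redis.conf’ might look like the following. The host matches our linked container name, and the db and key match what we’ll push from Python later on; the sebp/elk image should already ship an output config pointing at Elasticsearch, so an input block is all we need here (check the Logstash Redis input docs for your version’s exact options):

```conf
input {
  redis {
    host      => "redis"     # the --link alias of our Redis container
    port      => 6379
    db        => 4           # the db our Python example writes to
    data_type => "list"
    key       => "whatever"  # the list our Python example pushes to
  }
}
```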


Here we’re just putting in some basic information so that Logstash and Redis are able to communicate. For more information and configuration options for Redis and Logstash see here. Now save this file and restart the Logstash service.

Once you’ve restarted the service, check out the Logstash log like so:
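In the sebp/elk container (which is Ubuntu-based) that looks something like the below; the log path is an assumption and may vary between image versions:

```shell
# Restart Logstash so it picks up the new config file
service logstash restart

# Follow the log to confirm the pipeline starts cleanly
tail -f /var/log/logstash/logstash.log
```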


If all went well you should see the “Pipeline main started” message, indicating that the ‘03-redis.conf’ file has been accepted.

Now that we’ve got Redis hooked up to Logstash we should send some data into Redis to hopefully see it appear on the other side. There are a few ways to do this, but let’s look at a basic one using Python and the redis package.

After you install the redis package from pip, take this simple example:

import json
import redis

# Port 32788 is the random host port Docker mapped to 6379 (thanks to -P)
r = redis.StrictRedis(host='localhost', port=32788, db=4)

items = {
    'Phone': "iPhone",
    'Computer': "Dell",
    'Bird': "Seagull"
}

for key in items:
    # Serialise each entry to JSON so Logstash can parse it on the other end
    r.lpush('whatever', json.dumps({key: items[key]}))

We run this example from our host machine and connect to our Redis container, which is listening on ‘localhost’ port 32788 (a random port thanks to the -P flag in our docker run).

You can either use the redis-cli to check and make sure your data is being populated into Redis or you can download the Redis Desktop Manager to help view the database.
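With redis-cli it’s a quick check from the host; remember to point it at the mapped port and select db 4, where we pushed the data (note that once Logstash starts consuming the list it will drain it, so check soon after pushing):

```shell
# How many entries are sitting in our 'whatever' list?
redis-cli -p 32788 -n 4 llen whatever

# Dump the whole list to eyeball the JSON we pushed
redis-cli -p 32788 -n 4 lrange whatever 0 -1
```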

viewing data in our ‘whatever’ list


We’ve officially added data into Redis from Python. Now let’s make sure this data is appearing in Elasticsearch.

Jump over to the Kibana web console and set a dummy index pattern of * for now.

For a pattern to test you can use *

Now check the discover tab and hit search to retrieve your data from Redis.


Cool! We can now see the data coming from Redis through Logstash and onto Elasticsearch.

This is a very rough start but hopefully it’s enough to get you interested in going further into the Elastic Stack. For further reading I would suggest looking into Logstash output for filtering logs and events and also time based graphing.

