Introduction to Centralised Logging with the Elastic Stack

Modern networking infrastructure is becoming increasingly complex. Most enterprises maintain a complex web of internal file, email, application and identity servers spanning the globe. Add to this the hybrid cloud computing model and the ever-expanding scope and reach of the Internet of Things (IoT), and the picture becomes harder still to manage.

Centralised Logging

Monitoring all computing resources, and most importantly gathering vital statistical data from them, is a key objective for most enterprise organisations. Many modern organisations, regardless of size, are geographically dispersed, which means their computing resources are dispersed too, making it challenging to maintain control of and monitor these devices.

In many cases organisations need the ability to quickly and easily ingest a number of different file formats from a wide variety of sources. Coupled with this, the intended users of these systems want to be able to visualise that data just as quickly and easily, for ease of use and understanding.

What is the Elastic Stack?

This is the void that the Elastic Stack helps your organisation fill. The Elastic Stack is a free, open-source solution which your organisation can set up and configure itself. Elastic do also offer a paid-for hosted service which you can easily purchase on Azure, Google Cloud Platform and Amazon Web Services, and we do recommend checking those services out. However, this article will guide you through setting up and configuring the Elastic Stack in your own environment, for example in a virtual environment such as Proxmox.

The Elastic Stack can run on almost any platform, as it is for the most part developed in Java and therefore makes use of the JVM. However, Elastic do recommend using Linux. For the purposes of this guide we'll be using Ubuntu 16.04 servers to install and configure the Elastic Stack, but you are free to use the distribution of your choice, e.g. Red Hat, CentOS etc.

We will need to provision three servers, each hosting a specific component of the stack: Elasticsearch, Logstash and Kibana.


Elasticsearch

A distributed, RESTful search and analytics engine capable of solving a growing number of use cases. The heart of the Elastic Stack, it centrally stores both structured and unstructured data, providing the ability to search it, discover the expected and uncover the unexpected.
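Because Elasticsearch is RESTful, everything is done over HTTP with JSON bodies. As a minimal sketch of what a search request body looks like (the index name "logs" and the field "message" below are illustrative, not from this article):

```python
import json

# Sketch of the kind of JSON body you would POST to Elasticsearch's
# search endpoint, e.g. http://localhost:9200/logs/_search
# (index name "logs" and field "message" are illustrative placeholders).
def build_match_query(field, text, size=10):
    """Build an Elasticsearch query DSL body matching `text` in `field`."""
    return {
        "size": size,
        "query": {"match": {field: text}},
    }

body = build_match_query("message", "error")
print(json.dumps(body, indent=2))
```

The same query-DSL shape is what Kibana generates under the hood when you search from its UI.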


Logstash

Logstash is an open-source, server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to the data repository of your choice.
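A minimal Logstash pipeline configuration gives a feel for this ingest-transform-send flow. This sketch assumes the common Beats-to-Elasticsearch setup; the port, host and index name are illustrative:

```conf
# Illustrative Logstash pipeline: receive events shipped by Beats
# on port 5044 and forward them to a local Elasticsearch node.
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
}
```

A filter block can be added between input and output to transform events before they are indexed.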


Kibana

Kibana enables the visualisation of Elasticsearch data and the navigation and management of the Elastic Stack.

Installing the Elastic Stack is relatively simple if you are familiar with the Linux terminal. In our case we installed the stack on headless Ubuntu 16.04 machines. There are detailed instructions for installing the stack available on the Elastic website. However, if you need to know how to include SSL certificates in your authentication, you may want to check out Install and configure Elastic Stack on Ubuntu.

Getting Data to the Elastic Stack

Once you’ve got your Elastic Stack installed and configured as you desire, the fun can begin. The Elastic Stack is extremely flexible and can adapt to meet almost any big data need. Getting data to the stack is made extremely simple by installing lightweight data shippers, called Beats.

Beats provide a lot of functionality out of the box and are well worth checking out:

  • Filebeat – log files
  • Metricbeat – metric data
  • Packetbeat – network data
  • Winlogbeat – Windows event logs
  • Heartbeat – uptime monitoring
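To give a feel for how lightweight the configuration is, here is a sketch of a filebeat.yml that tails nginx logs and ships them to Logstash. The paths and hostname are illustrative, and the `prospectors` key shown here belongs to the Filebeat versions current for Ubuntu 16.04 (it was later renamed `filebeat.inputs`):

```yaml
# Illustrative filebeat.yml: tail the nginx logs and ship them to Logstash.
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/nginx/access.log
      - /var/log/nginx/error.log

# Send everything to the Logstash server (host/port are placeholders).
output.logstash:
  hosts: ["logstash.example.local:5044"]
```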

Use case scenarios

We have made use of the Elastic Stack in three different use case scenarios.

Monitoring of mini web servers

We were looking for a quick, easy and cheap solution to monitor a number of mini web servers running on Raspberry Pis, responsible for providing help manuals across a distributed network. We just needed a mechanism for periodic uptime monitoring that could also ship the nginx log files.

In this scenario we installed and configured Filebeat to ship the nginx logs and Heartbeat for the uptime monitoring.
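The Heartbeat side of this needs only a few lines of configuration. A sketch of a heartbeat.yml for polling each Pi's web server might look roughly like this (the URL, interval and output host are illustrative, not our actual values):

```yaml
# Illustrative heartbeat.yml: poll a Pi's nginx endpoint on a schedule
# and record the result in Elasticsearch for uptime dashboards in Kibana.
heartbeat.monitors:
  - type: http
    urls: ["http://pi-manuals.example.local:80"]
    schedule: '@every 30s'

output.elasticsearch:
  hosts: ["elasticsearch.example.local:9200"]
```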

Machine sensor data

Our second scenario was a typical IoT one: how to reliably deliver frequent, tiny messages capturing machine sensor data and notifying the progress of specific jobs. The sensors were configured to log activity to a rolling log file, and in this scenario we made use of Filebeat once again to ship the data.
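Shipped sensor lines are most useful once they are parsed into structured fields, which is a natural job for a Logstash grok filter. The log line format below is hypothetical, purely to illustrate the idea:

```conf
# Illustrative Logstash filter: parse a hypothetical sensor log line such as
#   2017-06-01T10:15:00 machine-07 job=1234 progress=75
# into separate timestamp, machine, job and progress fields.
filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:ts} %{HOSTNAME:machine} job=%{INT:job_id} progress=%{INT:progress}"
    }
  }
}
```

Once the fields are structured, Kibana can chart job progress per machine rather than just showing raw text lines.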

Door Entry control system

We were commissioned to develop a simple door access control system, once again utilising Raspberry Pis, and once again we utilised Filebeat to ship the log files.

The Elastic Stack has pretty much become our go-to centralised logging system, most notably for its ease of use and scalability.
