
ELK Docker

(Project logo)

This project shows how multiple containers can aggregate their logs into a logging infrastructure with Docker Compose, using Logstash, Elasticsearch, and Kibana.
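The actual wiring between the containers lives in the repository's docker-compose.yml. As a rough sketch of what such a file might look like (the image tags, the gelf logging driver, and the port numbers below are assumptions for illustration, not taken from the repository):

version: "3"
services:
  httpd:                                  # sample application that produces logs
    image: httpd:2.4
    ports:
      - "80:80"
    logging:
      driver: gelf                        # ship container stdout/stderr to Logstash
      options:
        gelf-address: "udp://localhost:12201"   # resolved by the Docker daemon on the host
  logstash:
    image: docker.elastic.co/logstash/logstash:7.17.9
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline
    ports:
      - "12201:12201/udp"                 # GELF input published on the host
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.9
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.9
    ports:
      - "5601:5601"
  grafana:
    image: grafana/grafana:latest
    ports:
      - "3000:3000"

With a layout along these lines, docker-compose up starts everything and the httpd logs reach Elasticsearch through Logstash, so they show up under the logstash-* index used in the Kibana step below.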

Run

You will need Docker and Docker Compose installed on your machine. After that:

  1. Run
docker-compose up
  2. Run the following to generate some logs from httpd:
repeat 10 curl http://localhost:80/
  3. Kibana:

    • This might take a bit to become available
    • Navigate to http://localhost:5601
    • Add logstash-* as the index pattern, with @timestamp as the Time-field name
    • Go to Discover
  4. Grafana:

    • Navigate to http://localhost:3000
    • The username and password are both admin
    • Navigate around and create dashboards

Architecture

The target architecture gathers information from the application containers, but would also sync with Hadoop to build a data lake for richer analytics, and pull data directly from Google Analytics into Logstash.

(Architecture diagram)

(The current architecture is still missing Google Analytics and Hadoop, as noted in the Backlog below.)

Backlog

  • Add Kibana container
  • Add Grafana container for visualization
  • Add Elasticsearch container
  • Add Logstash container and configuration
  • Add application containers
  • Add dashboards as code for Grafana (see the provisioning sketch after this list)
  • Add import from Google Analytics through Logstash and http_poller
  • Add Hadoop infrastructure for data analytics extension
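
For the "dashboards as code" item, Grafana can provision datasources and dashboards from files mounted into the container. A minimal datasource sketch is shown below; the file path, service name, and version values are assumptions consistent with the compose sketch above, not the repository's actual configuration:

# e.g. mounted at /etc/grafana/provisioning/datasources/elasticsearch.yml (hypothetical path)
apiVersion: 1
datasources:
  - name: Elasticsearch
    type: elasticsearch
    access: proxy
    url: http://elasticsearch:9200        # service name from the compose sketch
    database: "logstash-*"                # same index pattern as in the Kibana step
    jsonData:
      timeField: "@timestamp"
      esVersion: "7.17.9"                 # assumed Elasticsearch version

Dashboards themselves can be provisioned the same way from JSON files, which would make the Grafana step above reproducible instead of manual.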

References & further reading

Github Repo

Check out the new logo that I created on LogoMakr.com https://logomakr.com/5axvTc