
Easy Docker container logging with Filebeat on AWS

Getting logs out of Docker is not hard, but aggregating them in one central place, even for short-running containers, is less straightforward. This article shows you how to get it done in about 10 minutes with AWS ECS.

Requirements:

  • A running container on a Linux machine (ECS or otherwise)
  • A running Elasticsearch instance accessible over HTTP(S). This can be a local one or an AWS Elasticsearch Service domain (the free tier is fine)
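
Before installing anything, it helps to confirm that the Elasticsearch endpoint is reachable from the machine running your containers. The hostname below is a placeholder for your own endpoint; hitting the root path should return a small JSON document with the cluster name and version:

# Replace the hostname with your own Elasticsearch endpoint
curl https://your-elasticsearch-endpoint:443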

# Download Filebeat from the Elastic website (you need version 5.0+)
curl -O https://download.elastic.co/beats/filebeat/filebeat-5.0.0-alpha5-linux-x86_64.tar.gz
# Extract the downloaded archive
tar -xvf filebeat-5.0.0-alpha5-linux-x86_64.tar.gz
# Create a symlink to avoid the very verbose name
ln -s filebeat-5.0.0-alpha5-linux-x86_64/ filebeat
Next, edit the filebeat.yml file that ships inside the extracted directory:
filebeat.prospectors:
# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector-specific configurations.
- input_type: log
  # Paths that should be crawled and fetched. Glob-based paths.
  # Docker logs are in /var/lib/docker by default.
  paths:
    - /var/lib/docker/containers/*/*.log
  # Docker logs are JSON, Filebeat can extract the fields automatically
  json.keys_under_root: true
  json.message_key: log
  json.add_error_key: true

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["https://search-prismatic-asddsasdadsadsadsadsa.ap-southeast-2.es.amazonaws.com:443"]
  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

# Optional: keep the logs of all your containers on disk as well
output.file:
  path: "/tmp/filebeat"
  filename: filebeat.log
  rotate_every_kb: 10000
  number_of_files: 7
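
With the configuration saved, start Filebeat. Reading /var/lib/docker usually requires root, so the sketch below runs it with sudo; -c points at the config file and -e sends Filebeat's own log output to stderr so you can watch it ship entries:

# Run Filebeat in the foreground with our config
# (root is typically needed to read /var/lib/docker)
cd filebeat
sudo ./filebeat -e -c filebeat.yml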

That's all: your logs are now in Elasticsearch, and you can visualize them with Kibana.
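
If you want to confirm that documents are actually arriving before opening Kibana, you can query Elasticsearch directly; by default Filebeat writes to daily filebeat-* indices (the hostname is again a placeholder for your own endpoint):

# Count the documents Filebeat has indexed so far
curl "https://your-elasticsearch-endpoint:443/filebeat-*/_count"

In Kibana, point an index pattern at filebeat-* and your container logs will be ready to explore.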