Central logging in the dockersphere
I recently decided to go all docker on a VPS I have, and I also wanted a central logging instance with elasticsearch as the backend. The mail service container I was considering already had some filebeat support for elasticsearch, but I wanted to collect the logs from all running containers and ship them to elasticsearch.
The docker daemon has gained some fancy logging driver support, so I decided to use fluentd to ship the logs directly to elasticsearch and skip the logstash step. After some google-fu I understood that fluentd also understands grok patterns to enrich the data, so that should be possible...
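As a rough sketch of what that could look like: with the fluent-plugin-grok-parser gem installed, a filter along these lines should parse a syslog-style line (the docker.postfix tag and the pattern are placeholders for illustration; newer fluentd versions nest this in a <parse> section instead):

# Hypothetical example, not part of my setup (yet);
# requires: gem install fluent-plugin-grok-parser
<filter docker.postfix>
  @type parser
  key_name log                # the docker fluentd driver puts the line in the "log" field
  format grok
  grok_pattern %{SYSLOGBASE} %{GREEDYDATA:message}
</filter>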
First I needed a fluentd container. It is a trivial build; only the elasticsearch plugin is needed.
# fluentd/Dockerfile
FROM fluent/fluentd:stable
RUN ["gem", "install", "fluent-plugin-elasticsearch", "--no-rdoc", "--no-ri" ]
So I added the following to my docker-compose.yaml:
####
# fluentd for log capture
fluentd:
  build: ./fluentd
  volumes:
    - ${DOCKERDIR}/fluentd/config:/fluentd/etc
  links:
    - elk
  ports:
    - "127.0.0.1:24224:24224"
    - "127.0.0.1:24224:24224/udp"
To prevent external access to fluentd I prefixed the port mappings with the localhost IP address. The docker daemon's fluentd logging driver connects to localhost by default.
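A quick way to check the plumbing (assuming the fluentd container is already up) is to run a throwaway container with the fluentd driver:

# one-off smoke test; the fluentd-address given here is the driver's default anyway
docker run --rm --log-driver=fluentd \
    --log-opt fluentd-address=127.0.0.1:24224 \
    alpine echo "hello fluentd"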
The config for fluentd is mounted into the container:
# fluentd/config/fluent.conf
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host elk
    port 9200
    logstash_format true
    logstash_prefix docker
    logstash_dateformat %Y%m%d
    include_tag_key true
    tag_key @log_name
    flush_interval 1s
  </store>
  # <store>
  #   @type stdout
  # </store>
</match>
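With logstash_format enabled like this, events land in daily indices named docker-YYYYMMDD. Once a container has logged something you can check for them (assuming elasticsearch's port 9200 is reachable from wherever you run curl):

# list the daily docker-* indices and their document counts
curl -s 'http://localhost:9200/_cat/indices/docker-*?v'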
For a container to log to this fluentd service you only have to configure the docker daemon to use the fluentd logging driver:
a_container:
  ...
  links:
    - fluentd
  logging:
    driver: "fluentd"
  ...
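The driver also takes options if you want more control; for example (the tag template here is just one possibility), you can set the address explicitly and tag events with the container name, which then shows up in the @log_name field in elasticsearch:

a_container:
  ...
  logging:
    driver: "fluentd"
    options:
      fluentd-address: "127.0.0.1:24224"
      tag: "docker.{{.Name}}"
  ...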
UPDATE: I could not easily reuse all the work that had been done by others for the postfix logstash patterns/conf. So I dropped the fluentd container and used the syslog driver to ship the logs to logstash. I will soon write up my notes about this.
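For the curious, the gist is a logging section along these lines (the address is just an example; the details will follow in that write-up):

a_container:
  ...
  logging:
    driver: "syslog"
    options:
      syslog-address: "udp://127.0.0.1:514"
  ...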