Docker: Ship log files being written inside containers to ELK stack


Problem Description


I am running a django application using docker, and using python logging in django settings to write api logs inside a logs folder. When I restart my container my log files are also removed (which is understandable).

I would like to ship my logs (e.g. /path/to/workdir/logs/django.log) to elasticsearch. I am confused since my searches tell me to ship this path /var/lib/docker/containers/*/*.log but I don't think this is what I want.

Any ideas on how I ship my logs inside the container to ELK Stack?

Solution

You can ship logs from a docker container's stdout / stderr to elasticsearch using the gelf logging driver.

Configure the services with the gelf logging driver (docker-compose.yml):

version: '3.7'
x-logging:
  &logstash
  options:
    gelf-address: "udp://localhost:12201"
  driver: gelf
services:
  nginx:
    image: 'nginx:1.17.3'
    hostname: 'nginx'
    domainname: 'example.com'
    depends_on:
    - 'logstash'
    ports:
    - '80:80'
    volumes:
    - '${PWD}/nginx/nginx.conf:/etc/nginx/nginx.conf:ro'
    logging: *logstash
  elasticsearch:
    image: 'elasticsearch:7.1.1'
    environment:
    - 'discovery.type=single-node'
    volumes:
    - 'elasticsearch:/usr/share/elasticsearch/data'
    expose:
    - '9200'
    - '9300'
  kibana:
    image: 'kibana:7.1.1'
    depends_on:
    - 'elasticsearch'
    ports:
    - '5601:5601'
    volumes:
    - '${PWD}/kibana/kibana.yml:/usr/share/kibana/config/kibana.yml'
  logstash:
    build: 'logstash'
    depends_on:
    - 'elasticsearch'
    volumes:
    - 'logstash:/usr/share/logstash/data'
    ports:
    - '12201:12201/udp'
    - '10514:10514/udp'
volumes:
  elasticsearch:
  logstash:

Note: the above example configures the logging using a compose extension field.
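
That extension field is just a named YAML anchor; every service that sets logging: *logstash gets the same block merged in, which is equivalent to writing the following under each such service:

logging:
  driver: gelf
  options:
    gelf-address: "udp://localhost:12201"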

The minimal nginx.conf used for this example:

user nginx;
worker_processes 1;
error_log /var/log/nginx/error.log debug;

pid /var/run/nginx.pid;

events {
    worker_connections  1024;
}

http {
  server {
    listen 80;
    server_name _;

    location / {
      return 200 'OK';
    }
  }
}

The logstash image is a custom build using the below Dockerfile:

FROM logstash:7.1.1

# Switch to root to copy the pipeline, settings and patterns into the image.
USER 0
COPY pipeline/gelf.cfg /usr/share/logstash/pipeline
COPY pipeline/pipelines.yml /usr/share/logstash/config
COPY settings/logstash.yml /usr/share/logstash/config
COPY patterns /usr/share/logstash/patterns

# Remove the default pipeline so only the gelf pipeline from pipelines.yml is loaded.
RUN rm /usr/share/logstash/pipeline/logstash.conf
RUN chown -R 1000:0 /usr/share/logstash/pipeline /usr/share/logstash/patterns /usr/share/logstash/config
# Drop back to the unprivileged logstash user.
USER 1000
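
For reference, the build context implied by build: 'logstash' and the COPY instructions above is laid out roughly like this (the exact contents of patterns/ are an assumption):

logstash/
  Dockerfile
  patterns/
  pipeline/
    gelf.cfg
    pipelines.yml
  settings/
    logstash.yml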

... the relevant logstash gelf plugin config:

input {
  # Receive GELF messages sent by the docker gelf logging driver on UDP 12201.
  gelf {
    type => docker
    port => 12201
  }
}

filter { }

output {
  if [type] == "docker" {
    elasticsearch { hosts => ["elasticsearch:9200"] }
    # Also dump events to the logstash container's stdout for debugging.
    stdout { codec => rubydebug }
  }
}

... and pipelines.yml:

- pipeline.id: "gelf"
  path.config: "/usr/share/logstash/pipeline/gelf.cfg"

... and logstash.yml to persist the data:

queue:
  type: persisted
  drain: true

The process running in the container logs to stdout / stderr, and docker pushes those logs to logstash via the gelf logging driver. Note that the gelf-address points to localhost: the logging driver runs on the docker host, where compose service discovery cannot resolve the logstash service name, so logstash's UDP port must be published on the host and the driver configured against localhost. Logstash then writes the events to elasticsearch, where you can index and browse them in Kibana.
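
To connect this back to the question: rather than writing django.log inside the container, point Django's logging at stdout so the gelf driver picks it up. A minimal sketch of such a LOGGING setting follows; the formatter, handler and level names are illustrative, not taken from the original setup.

# settings.py (sketch): send Django / API logs to stdout so the container's
# gelf logging driver can ship them; names below are examples only.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'simple': {
            'format': '{levelname} {asctime} {name} {message}',
            'style': '{',
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'stream': 'ext://sys.stdout',  # captured by the logging driver
            'formatter': 'simple',
        },
    },
    'root': {
        'handlers': ['console'],
        'level': 'INFO',
    },
}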
