Logging from a Java app to ELK without needing to parse logs

Question

I want to send logs from a Java app to ElasticSearch, and the conventional approach seems to be to set up Logstash on the server running the app, and have logstash parse the log files (with regex...!) and load them into ElasticSearch.

Is there a reason it's done this way, rather than just setting up log4j (or logback) to log things in the desired format directly into a log collector that can then be shipped to ElasticSearch asynchronously? It seems crazy to me to have to fiddle with grok filters to deal with multiline stack traces (and burn CPU cycles on log parsing) when the app itself could just log in the desired format in the first place.
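To make the question concrete: the kind of setup being described is possible today, for example with the logstash-logback-encoder library, which makes logback write each event as a JSON document that Logstash (or Filebeat) can ship without any grok parsing. The file path below is a placeholder; a minimal logback.xml sketch, assuming logstash-logback-encoder is on the classpath:

```xml
<configuration>
  <!-- Write log events as JSON, one object per line; stack traces
       become a single JSON field instead of multiple text lines. -->
  <appender name="JSON_FILE" class="ch.qos.logback.core.FileAppender">
    <file>/var/log/myapp/app.json</file>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>

  <root level="INFO">
    <appender-ref ref="JSON_FILE"/>
  </root>
</configuration>
```

With this in place, the shipper's only job is to forward complete JSON lines, so multiline stack traces never need to be reassembled by regex.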

On a tangentially related note, for apps running in a Docker container, is it best practice to log directly to ElasticSearch, given the need to run only one process?

Answer

I think it's usually ill-advised to log directly to Elasticsearch from a Log4j/Logback/whatever appender, but I agree that writing Logstash filters to parse a "normal" human-readable Java log is a bad idea too. I use https://github.com/logstash/log4j-jsonevent-layout everywhere I can to have Log4j's regular file appenders produce JSON logs that don't require any further parsing by Logstash.
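The layout linked above plugs into an ordinary Log4j 1.x file appender. The log file path below is a placeholder; a minimal log4j.properties sketch, assuming the log4j-jsonevent-layout jar is on the classpath:

```properties
# Route everything through a plain file appender, but emit each
# event as a Logstash-compatible JSON object instead of pattern text.
log4j.rootLogger=INFO, json
log4j.appender.json=org.apache.log4j.FileAppender
log4j.appender.json.File=/var/log/myapp/app.json
log4j.appender.json.layout=net.logstash.log4j.JSONEventLayoutV1
```

Because each event (including its stack trace) is one JSON document, Logstash only needs a file input with a JSON codec, and the grok/multiline filters disappear entirely.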
