How to manage multiline events based on a random field in Logstash


Problem description


I've been facing a problem related to multiline events lately, and I need a bit of help with it. My syslog server sends multi-line events: a single event spans several lines, and the indicator that a particular line belongs to a given event is a random number that identifies a user connection session. Here is a custom-generated log file:

Feb 16 17:29:04 slot1/APM-LTM notice apd[5515]: 01490010:5: 1ec2b273:Username 'cjones'

Feb 16 17:29:04 slot1/APM-LTM warning apd[5515]: 01490106:4: 1ec2b273: AD module: authentication with 'cjones' failed: Preauthentication failed, principal name: cjones@GEEKO.COM. Invalid user credentials. (-1765328360)

Feb 16 17:10:04 slot1/APM-LTM notice apd[5515]: 01490010:5: d8b5a591: Username 'gbridget'

Feb 16 17:10:04 slot1/APM-LTM err apd[5515]: 01490107:3: d8b5a591: AD module: authentication with 'gbridget' failed: Clients credentials have been revoked, principal name: gbridget@GEEKO.COM. User account is locked (-1765328366)

Feb 16 17:29:04 slot1/APM-LTM notice apd[5515]: 01490005:5: 1ec2b273: Following rule 'fallback' from item 'AD Auth' to ending 'Deny'

Feb 16 17:29:04 slot1/APM-LTM notice apd[5515]: 01490102:5: 1ec2b273: Access policy result: Logon_Deny


Above are the lines related to two different connections, identified by the session ids d8b5a591 (user gbridget) and 1ec2b273 (user cjones). The session id is the only indicator that ties these lines to two distinct events. Note also that the lines of the two events are interleaved.
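Whatever the multiline strategy, the session id first has to be extracted into its own field. A minimal grok sketch for the log format above might look like this (the field names `msg_code`, `session_id`, and `msg_body` are my own choices, not anything standard, and the pattern is untested against every variant of the log):

```
filter {
  grok {
    match => {
      "message" => "%{SYSLOGTIMESTAMP:timestamp} %{NOTSPACE:host} %{LOGLEVEL:level} %{WORD:program}\[%{POSINT:pid}\]: %{NOTSPACE:msg_code}: %{WORD:session_id}:%{SPACE}%{GREEDYDATA:msg_body}"
    }
  }
}
```

`%{SPACE}` (which matches zero or more whitespace characters) is used after the session id because the sample lines are inconsistent: `1ec2b273:Username` has no space after the colon, while `1ec2b273: AD module` does.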


The problem is that I am at a loss as to how to express this with the grok filter and the multiline plugin, since the latter offers too few options. The notions of "previous" and "next" line cannot be applied here, so the multiline options "pattern" and "what" cannot be used, because the lines of a single event are not necessarily consecutive.


I would really appreciate it if someone could shed some light on this and tell me whether it is at least feasible.

Recommended answer


I don't see these as multi-line events, but as related events. I would load them into Elasticsearch as six different documents and then query as needed. If there are specific queries you're trying to run against this data, you might ask a separate question about how to perform them across multiple documents.
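With each line stored as its own document, reassembling a session is a simple term query on the extracted session-id field. A sketch in Elasticsearch query DSL (the index name `apm-logs` and the field names `session_id` and `@timestamp` are assumptions, matching the grok example, not anything the answer specifies):

```
GET /apm-logs/_search
{
  "query": {
    "term": { "session_id": "1ec2b273" }
  },
  "sort": [
    { "@timestamp": { "order": "asc" } }
  ]
}
```

This returns all lines of the cjones session in chronological order, which covers most uses of a "multiline event" without ever merging the documents.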


One alternative would be to use the session_id as the document id, so that you could update the initial document when new information comes in. Using your own document ids is not recommended (for performance reasons, IIRC), and updating a document involves deleting the old one and inserting a new one, which is also not good for performance.
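If you accept those performance caveats, the update-in-place alternative can be sketched with the Logstash Elasticsearch output's `document_id`, `action`, and `doc_as_upsert` options (the index name is assumed, and this presumes a `session_id` field has already been extracted upstream):

```
output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "apm-sessions"
    document_id   => "%{session_id}"
    action        => "update"
    doc_as_upsert => true
  }
}
```

With `doc_as_upsert`, the first line of a session creates the document and later lines for the same session_id update it, so one document per user session accumulates in the index.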

