Combine logs and query in ELK
Problem description
With the ELK (Elasticsearch-Logstash-Kibana) stack, I collect syslog logs from *nix boxes into Logstash and send them to Kibana via Elasticsearch. This is a classic scenario.
My syslog stream includes normal system events, squid access logs, captive portal login logs, etc. The captive portal logs entries like this:
1423548430 2582 192.168.1.23 xx:ae:xx:e1:xx:99 mike.brown cc9aeb1210b39571 MTI= first
and squid logs access entries like this:
1423562965.228 482 192.168.1.23 TCP_MISS/200 1254 POST http://ad4.liverail.com/? - DIRECT/31.13.93.12 text/xml
In Logstash, I have a filter for the captive portal log that gives me client_ip="192.168.1.23" and user_name="mike.brown", and a different filter in the Logstash configuration for the squid access log that gives me src_ip="192.168.1.23".
My question is: how can I query in Kibana to get the user_name where the src_ip of a squid access log equals the client_ip of a captive portal log?
Answer
You can't do joins in Elasticsearch. A few of the options for modelling relationships are discussed in the Elasticsearch documentation.
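Since Elasticsearch cannot join the two log types at query time, one common workaround (a sketch, not part of the original answer) is to run two searches and correlate the results in client code. The dicts below stand in for hits returned by a search client; the field names follow the question, but the data is invented for illustration:

```python
# Pretend results of two separate Elasticsearch queries,
# one per log type (sample data, not real hits).
captive_portal_hits = [
    {"client_ip": "192.168.1.23", "user_name": "mike.brown"},
    {"client_ip": "192.168.1.40", "user_name": "jane.doe"},
]
squid_hits = [
    {"src_ip": "192.168.1.23", "url": "http://ad4.liverail.com/?"},
]

# Build an IP -> user_name lookup from the captive portal hits...
ip_to_user = {h["client_ip"]: h["user_name"] for h in captive_portal_hits}

# ...then annotate each squid hit with the matching user_name.
for hit in squid_hits:
    hit["user_name"] = ip_to_user.get(hit["src_ip"])

print(squid_hits[0]["user_name"])  # mike.brown
```

An alternative is to do this enrichment at index time instead (e.g. adding the user_name to each squid event as it is ingested), so Kibana can filter on a single document type.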