Combining log entries with logstash


Problem description

I want to collect and process logs from dnsmasq and I've decided to use ELK. Dnsmasq is used as a DHCP server and as a DNS resolver, and hence it creates log entries for both services.

My goal is to send all DNS queries to Elasticsearch together with the requester IP, the requester hostname (if available) and the requester mac address. That will allow me to group the requests per mac address regardless of whether the device IP changed, and to display the hostname.

What I would like to do is the following:

1) Read entries like this one:

Mar 30 21:55:34 dnsmasq-dhcp[346]: 3806132383 DHCPACK(eth0)  192.168.0.80 04:0c:ce:d1:af:18 air

2) Temporarily store the relationships:

192.168.0.80 => 04:0c:ce:d1:af:18

192.168.0.80 => air
3) Enrich entries like the one below by adding the mac address and the hostname. If the hostname is empty I would add the mac address instead (a sketch of the desired result follows the example line).

Mar 30 22:13:05 dnsmasq[346]: query[A] imap.gmail.com from 192.168.0.80
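For illustration, the enriched query event I'm after would look roughly like this (field names taken from my filter below; clientmac and clientname come from the DHCPACK entry above):

{
       "message" => "Mar 30 22:13:05 dnsmasq[346]: query[A] imap.gmail.com from 192.168.0.80",
        "action" => "query",
     "subaction" => "A",
        "domain" => "imap.gmail.com",
      "clientip" => "192.168.0.80",
     "clientmac" => "04:0c:ce:d1:af:18",
    "clientname" => "air"
}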

I found a module called "memorize" that would allow me to store them, but unfortunately it does not work with the latest version of Logstash.

The versions I am using:

Elasticsearch 2.3.0
Kibana 4.4.2
Logstash 2.2.2

And here is the logstash filter (this is my first attempt with logstash, so I'm sure the configuration file can be improved):

input {
  file {
    path => "/var/log/dnsmasq.log"
    start_position => "beginning"
    type => "dnsmasq"
  }
}  

filter {
  if [type] == "dnsmasq" {
    grok {
      match =>  [ "message", "%{SYSLOGTIMESTAMP:reqtimestamp} %{USER:program}\[%{NONNEGINT:pid}\]\: ?(%{NONNEGINT:num} )?%{NOTSPACE:action} %{IP:clientip} %{MAC:clientmac} ?(%{HOSTNAME:clientname})?"]
      match =>  [ "message", "%{SYSLOGTIMESTAMP:reqtimestamp} %{USER:program}\[%{NONNEGINT:pid}\]\: ?(%{NONNEGINT:num} )?%{USER:action}?(\[%{USER:subaction}\])? %{NOTSPACE:domain} %{NOTSPACE:function} %{IP:clientip}"]
      match =>  [ "message", "%{SYSLOGTIMESTAMP:reqtimestamp} %{USER:program}\[%{NONNEGINT:pid}\]\: %{NOTSPACE:action} %{DATA:data}"]
    }

    if [action] =~ "DHCPACK" {
      # here I want to remember the IP => mac/hostname mapping
    } else if [action] == "query" {
      # here I want to add the remembered mac/hostname to the query
    } else {
      drop {}
    }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
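For reference, the first grok pattern above is meant to extract roughly the following fields from the DHCPACK line shown earlier (a sketch of the rubydebug output; the @timestamp, host and path fields that Logstash adds are omitted):

{
    "reqtimestamp" => "Mar 30 21:55:34",
         "program" => "dnsmasq-dhcp",
             "pid" => "346",
             "num" => "3806132383",
          "action" => "DHCPACK(eth0)",
        "clientip" => "192.168.0.80",
       "clientmac" => "04:0c:ce:d1:af:18",
      "clientname" => "air"
}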

Questions:

1) Is there an alternative to the "memorize" plugin that works with the latest logstash version? Either another plugin or a different procedure.

2) Should I downgrade logstash to a version before 2 (I think the previous one is 1.5.4)? If so, is there any known severe issue or incompatibility with elasticsearch 2.2.1?

3) Or shall I modify the "memorize" plugin so that it works with logstash 2.x (if so, I'd appreciate any pointers on how to start)?

Answer

There's no need to repack the memorize plugin for this, in my opinion. You can use the aggregate filter to achieve what you want.

...

# record host/mac in temporary map
if [action] =~ "DHCPACK" {
  aggregate {
     task_id => "%{clientip}"
     code => "map['clientmac'] = event['clientmac']; map['clientname'] = event['clientname'];"
     map_action => "create_or_update"
     # timeout set to 48h
     timeout => 172800
  }
}

# add host/mac where/when needed
else if [action] == "query" {
   aggregate {
     task_id => "%{clientip}"
     code => "event['clientmac'] = map['clientmac']; event['clientname'] = map['clientname']"
     map_action => "update"
   }
}
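As a rough, untested sketch of how this fits into the filter from the question, the two aggregate blocks drop straight into the empty DHCPACK and query branches. The || fallback to the mac address is my addition, based on step 3 of the question. Also note that the aggregate filter documentation asks you to run Logstash with a single filter worker (-w 1), since all events sharing a task_id must go through the same thread:

filter {
  if [type] == "dnsmasq" {
    grok {
      # same three patterns as in the question
      match => [ "message", "%{SYSLOGTIMESTAMP:reqtimestamp} %{USER:program}\[%{NONNEGINT:pid}\]\: ?(%{NONNEGINT:num} )?%{NOTSPACE:action} %{IP:clientip} %{MAC:clientmac} ?(%{HOSTNAME:clientname})?"]
      match => [ "message", "%{SYSLOGTIMESTAMP:reqtimestamp} %{USER:program}\[%{NONNEGINT:pid}\]\: ?(%{NONNEGINT:num} )?%{USER:action}?(\[%{USER:subaction}\])? %{NOTSPACE:domain} %{NOTSPACE:function} %{IP:clientip}"]
      match => [ "message", "%{SYSLOGTIMESTAMP:reqtimestamp} %{USER:program}\[%{NONNEGINT:pid}\]\: %{NOTSPACE:action} %{DATA:data}"]
    }

    if [action] =~ "DHCPACK" {
      # remember which mac/hostname currently owns this IP (kept for 48h)
      aggregate {
        task_id => "%{clientip}"
        code => "map['clientmac'] = event['clientmac']; map['clientname'] = event['clientname']"
        map_action => "create_or_update"
        timeout => 172800
      }
    } else if [action] == "query" {
      # enrich the DNS query with the remembered mac/hostname;
      # fall back to the mac address when no hostname was recorded
      aggregate {
        task_id => "%{clientip}"
        code => "event['clientmac'] = map['clientmac']; event['clientname'] = map['clientname'] || map['clientmac']"
        map_action => "update"
      }
    } else {
      drop {}
    }
  }
}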
