Issue with conditionals in Logstash with fields from Kafka ----> FileBeat prospectors


Question

I have the following scenario:

FileBeat ----> Kafka -----> Logstash -----> Elastic ----> Kibana

In Filebeat I have 2 prospectors in the YML file, and I add some fields to identify the log data. But the issue is that in Logstash I haven't been able to validate these fields.

The configuration files are:

1. filebeat.yml

filebeat.prospectors:
- input_type: log
  paths:
    - /opt/jboss/server.log*
  tags: ["log_server"]
  fields:
    environment: integracion
    log_type: log_server

  document_type: log_server
  fields_under_root: true


- input_type: log
  paths:
    - /var/todo1_apps/ebanTX.log*
  tags: ["log_eban"]
  fields:
    environment: integracion
    log_type: log_ebanking

  document_type: log_ebanking
  fields_under_root: true

output.kafka:
    enabled: true
    hosts: ["192.168.105.68:9092"]
    topic: "sve_logs"
    timeout: 30s
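For context, Filebeat serializes each event as a JSON document before publishing it to Kafka, and with fields_under_root: true the custom fields sit at the top level of that document. A sketch of what one published message might look like (the timestamp and log line are illustrative, not from the question):

```json
{
  "@timestamp": "2017-05-10T14:32:01.000Z",
  "message": "2017-05-10 14:32:01,123 INFO [main] - transaction started",
  "type": "log_ebanking",
  "tags": ["log_eban"],
  "environment": "integracion",
  "log_type": "log_ebanking"
}
```

The type and log_type keys only become Logstash event fields if the consuming input actually parses this JSON.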

2. logstash.conf

input {
  kafka {
    bootstrap_servers => "192.xxx.xxx.xxx:9092"
    group_id => "sve_banistmo"
    topics => ["sve_logs"]
    decorate_events => true
    codec => "plain"
    }
  }

filter {
if [type] == "log_ebanking" {
    grok {
       patterns_dir => ["patterns/patterns"]
        match => { "message" => "%{TIMESTAMP_ISO8601:logdate}%{SPACE}%{LOGLEVEL:level}%{SPACE}\[%{DATA:thread}]%{SPACE}-%{SPACE}%{GREEDYDATA:message_log}" }
        }
  }
}

output {
if [type] == "log_ebanking" {
      elasticsearch {
        hosts => ["192.168.105.67:9200"]
        index => "sve-banistmo-ebanking-%{+YYYY.MM.dd}"
      }
        stdout { codec => json}
  }
}

The problem is in the conditionals in the filter and output sections. I've tried:

@[metadata][type]
@metadata][type]
@metadata.type
metadata.type
[type]

with both the type and log_type variables. Nothing works!! :S If I don't use conditionals, the data flows without problems, so it is not a connection issue.

Please help me. I've reviewed all the related information, but in my case the conditionals don't work.

Thanks in advance,

Dario R

Answer

Use codec => "json" in the kafka input of logstash.conf to extract all fields from the message.
