Sending json format log to kibana using filebeat, logstash and elasticsearch?


Problem description

I have logs like this:

{"logId":"57aaf6c8d32fb","clientIp":"127.0.0.1","time":"03:11:29 pm","uniqueSubId":"57aaf6c98963b","channelName":"JSPC","apiVersion":"v1","modulName":null,"actionName":"apiRequest","typeOfError":"","statusCode":"","message":"In Auth","exception":"In Auth","logType":"Info"}

{"logId":"57aaf6c8d32fb","clientIp":"127.0.0.1","time":"03:11:29 pm","uniqueSubId":"57aaf6c987206","channelName":"JSPC","apiVersion":"v2","modulName":null,"actionName":"performV2","typeOfError":"","statusCode":"","message":"in inbox api v2 5","exception":"in inbox api v2 5","logType":"Info"}


I want to push them to kibana. I am using filebeat to send data to logstash, using the following configuration:

filebeat.yml

### Logstash as output
logstash:
  # The Logstash hosts
  hosts: ["localhost:5044"]

  # Number of workers per Logstash host.
  #worker: 1


Now, using the following configuration, I want to change the codec type:

input {
  beats {
    port => 5000
    tags => "beats"
    codec => "json_lines"
    #ssl => true
    #ssl_certificate => "/opt/filebeats/logs.example.com.crt"
    #ssl_key => "/opt/filebeats/logs.example.com.key"
  }

  syslog {
    type => "syslog"
    port => "5514"
  }
}


But I still get the logs in string format:


"message": "{\"logId\":\"57aaf6c96224b\",\"clientIp\":\"127.0.0.1\",\"time\":\"03:11:29 pm\",\"channelName\":\"JSPC\",\"apiVersion\":null,\"modulName\":null,\"actionName\":\"404\",\"typeOfError\":\"EXCEPTION\",\"statusCode\":0,\"message\":\"404 page encountered http:\/\/localjs.com\/uploads\/NonScreenedImages\/profilePic120\/16\/29\/15997002iicee52ad041fed55e952d4e4e163d5972ii4c41f8845105429abbd11cc184d0e330.jpeg\",\"logType\":\"Error\"}",


Please help me solve this.

Answer


To parse JSON log lines in Logstash that were sent from Filebeat you need to use a json filter instead of a codec. This is because Filebeat sends its data as JSON and the contents of your log line are contained in the message field.
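To make that concrete, an event as it arrives from Filebeat looks roughly like this (the metadata field values shown here are illustrative, not taken from the question; the point is that your entire log line arrives as the string value of `message`):

```json
{
  "@timestamp": "2016-08-10T15:11:29.000Z",
  "beat": { "hostname": "myhost", "name": "myhost" },
  "source": "my_json.log",
  "tags": ["json"],
  "message": "{\"logId\":\"57aaf6c8d32fb\",\"clientIp\":\"127.0.0.1\",\"logType\":\"Info\"}"
}
```

A codec on the input cannot help here, because the outer event is already valid JSON; it is the inner `message` string that still needs to be parsed, which is exactly what the `json` filter does.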


Logstash config:

input {
  beats {
    port => 5044
  }   
}   

filter {
  if [tags][json] {
    json {
      source => "message"
    }   
  }   
}   

output {
  stdout { codec => rubydebug { metadata => true } } 
}


Filebeat config:

filebeat:
  prospectors:
    - paths:
        - my_json.log
      fields_under_root: true
      fields:
        tags: ['json']
output:
  logstash:
    hosts: ['localhost:5044']


In the Filebeat config, I added a "json" tag to the event so that the json filter can be conditionally applied to the data.
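The effect of the conditional filter can be sketched in plain Python (a simplified illustration of the behavior, not Logstash internals — the function name is mine):

```python
import json

def apply_json_filter(event):
    """Mimic the conditional json filter above: if the event carries the
    'json' tag, parse the JSON string in 'message' and merge the parsed
    fields into the event. A minimal sketch, not actual Logstash code."""
    if "json" in event.get("tags", []):
        parsed = json.loads(event["message"])
        event.update(parsed)
    return event

# A Filebeat-style event whose payload is a JSON string in "message".
event = {
    "tags": ["json"],
    "message": '{"logId": "57aaf6c8d32fb", "clientIp": "127.0.0.1", "logType": "Info"}',
}

apply_json_filter(event)
print(event["logId"])    # -> 57aaf6c8d32fb
print(event["logType"])  # -> Info
```

Events without the `json` tag pass through untouched, which is why tagging in Filebeat (rather than parsing everything) is a safe way to mix JSON and non-JSON sources on the same beats input.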


Filebeat 5.0 is able to parse the JSON without the use of Logstash, but it is still an alpha release at the moment. This blog post titled Structured logging with Filebeat demonstrates how to parse JSON with Filebeat 5.0.
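If you do try the Filebeat 5.0 route, the prospector-level JSON options look roughly like this (option names are from the Filebeat 5 documentation; since it is an alpha release, verify them against the docs for your exact version):

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - my_json.log
    # Decode each line as JSON and place the keys at the top level
    # of the event instead of under a "json" sub-object.
    json.keys_under_root: true
    # Add an error field to the event if decoding fails.
    json.add_error_key: true

output.logstash:
  hosts: ["localhost:5044"]
```

With this in place the decoding happens in Filebeat itself, so the Logstash `json` filter from the answer above is no longer needed for these events.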

