Transform String into JSON so that it's searchable in Kibana/Elasticsearch


Problem Description

I have Elasticsearch, Filebeat and Kibana running on a Windows machine. Filebeat has a proper log file and is listening to the path. When I look at the data in Kibana, it looks fine.

My issue is that the message field is a String.

An example of one log line:

12:58:09.9608 Trace {"message":"No more Excel rows found","level":"Trace","logType":"User","timeStamp":"2020-08-14T12:58:09.9608349+02:00","fingerprint":"226fdd2-e56a-4af4-a7ff-724a1a0fea24","windowsIdentity":"mine","machineName":"NAME-PC","processName":"name","processVersion":"1.0.0.1","jobId":"957ef018-0a14-49d2-8c95-2754479bb8dd","robotName":"NAME-PC","machineId":6,"organizationUnitId":1,"fileName":"GetTransactionData"}

So what I would like to have now is that String converted to JSON, so that it is possible to search in Kibana for, e.g., the level field.

I already had a look at Filebeat. There I tried to enable LogStash. But then the data no longer arrives in Elasticsearch, and the log file is not generated into the LogStash folder either.

Then I downloaded LogStash via the install guide, but unfortunately I got this message:

C:\Users\name\Desktop\logstash-7.8.1\bin>logstash.bat
Sending Logstash logs to C:/Users/mine/Desktop/logstash-7.8.1/logs which is now configured via log4j2.properties
ERROR: Pipelines YAML file is empty. Location: C:/Users/mine/Desktop/logstash-7.8.1/config/pipelines.yml
usage:
  bin/logstash -f CONFIG_PATH [-t] [-r] [] [-w COUNT] [-l LOG]
  bin/logstash --modules MODULE_NAME [-M "MODULE_NAME.var.PLUGIN_TYPE.PLUGIN_NAME.VARIABLE_NAME=VALUE"] [-t] [-w COUNT] [-l LOG]
  bin/logstash -e CONFIG_STR [-t] [--log.level fatal|error|warn|info|debug|trace] [-w COUNT] [-l LOG]
  bin/logstash -i SHELL [--log.level fatal|error|warn|info|debug|trace]
  bin/logstash -V [--log.level fatal|error|warn|info|debug|trace]
  bin/logstash --help
[2020-08-14T15:07:51,696][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

So I tried to use Filebeat only. Here I set:

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
  - dissect: 
      tokenizer: '"%{event_time} %{loglevel} %{json_message}"' 
      field: "message" 
      target_prefix: "dissect"
  - decode_json_fields: 
      fields: ["json_message"]

But this gives me:

dissect_parsing_error

The tip about removing the quotes ("") around the tokenizer helped. Then I got:

I simply refreshed the index, and the message was gone. Nice.

But the question now is: how to filter for something in the new field?

Answer

The message says your pipeline config is empty. It seems you have not configured any pipeline yet. Logstash can do the trick (JSON filter plugin), but Filebeat is sufficient here. If you don't want to introduce another service, it is the better option.

It has the decode_json_fields option to transform specific fields containing JSON in your event into a JSON structure. Here is the documentation.
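A minimal sketch of that processor, taken on its own (fields, target, overwrite_keys and add_error_key are documented options of decode_json_fields; the field name json_message assumes a preceding dissect step extracted it):

```yaml
processors:
  - decode_json_fields:
      fields: ["json_message"]   # field(s) whose value is a JSON-encoded string
      target: ""                 # "" writes the decoded keys to the event root
      overwrite_keys: true      # decoded keys may replace existing event keys
      add_error_key: true       # on failure, record the reason in error.message
```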

For the future case where your whole event is JSON, there is the possibility of parsing in Filebeat by configuring json.message_key and the related json.* options.
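As a sketch of that variant, assuming the whole line is a single JSON object (which is not the case for the log line above, where a timestamp and level precede the JSON):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - # path to your logfile
    json.keys_under_root: true   # place the decoded keys at the event root
    json.add_error_key: true     # record decoding errors in error.message
    json.message_key: message    # key holding the log text, for multiline/filtering
```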

EDIT - Added a Filebeat snippet as a processors example of dissecting the log line into three fields (event_time, loglevel, json_message). Afterwards, the newly extracted field json_message, whose value is a JSON object encoded as a string, is decoded into a JSON structure:

 ... 

filebeat.inputs: 
  - type: log 
    paths: 
      - path to your logfile
  
processors: 
  - dissect: 
      tokenizer: '%{event_time} %{loglevel} %{json_message}' 
      field: "message" 
      target_prefix: "dissect"

  - decode_json_fields: 
      fields: ["dissect.json_message"]
      target: ""

  - drop_fields:
      fields: ["dissect.json_message"]


 ... 

If you want to practice the Filebeat processors, try to set the correct event timestamp, taken from the encoded JSON and written into @timestamp, using the timestamp processor.
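A sketch of that exercise, assuming decode_json_fields wrote the decoded keys to the event root so the timeStamp field from the JSON is available; the layouts entry is a Go-style reference time, which is the format this processor expects:

```yaml
processors:
  - timestamp:
      field: timeStamp           # taken from the decoded JSON object
      layouts:
        - '2006-01-02T15:04:05.999999999-07:00'
      test:                      # sample value the processor must parse at startup
        - '2020-08-14T12:58:09.9608349+02:00'
```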

