How to parse JSON in Logstash/grok from a text file line?

Question
I have a log file which looks like this (simplified):

Logline sample:
MyLine data={"firstname":"bob","lastname":"the builder"}
I'd like to extract the JSON contained in data and create two fields, one for firstname and one for lastname. However, the output I get is this:
{"message":"Line data={\"firstname\":\"bob\",\"lastname\":\"the builder\"}\r","@version":"1","@timestamp":"2015-11-26T11:38:56.700Z","host":"xxx","path":"C:/logstashold/bin/input.txt","MyWord":"Line","parsedJson":{"firstname":"bob","lastname":"the builder"}}
As you can see:

..."parsedJson":{"firstname":"bob","lastname":"the builder"}}

That's not what I need; I need firstname and lastname to be separate fields in Kibana, but Logstash isn't extracting them out with the json filter.
Logstash config:
input {
  file {
    path => "C:/logstashold/bin/input.txt"
  }
}

filter {
  grok {
    match => { "message" => "%{WORD:MyWord} data=%{GREEDYDATA:request}" }
  }
  json {
    source => "request"
    target => "parsedJson"
    remove_field => ["request"]
  }
}

output {
  file {
    path => "C:/logstashold/bin/output.txt"
  }
}
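For intuition, here is a minimal Python sketch (not Logstash code; the regex is only a rough stand-in for the grok pattern) of what the grok and json filters above do to the sample line:

```python
import json
import re

# Rough stand-in for the grok pattern %{WORD:MyWord} data=%{GREEDYDATA:request}:
# capture the first word, then everything after "data=".
line = 'MyLine data={"firstname":"bob","lastname":"the builder"}'
match = re.match(r'(?P<MyWord>\w+) data=(?P<request>.*)', line)

event = {"message": line, "MyWord": match.group("MyWord")}
# Like the json filter with target => "parsedJson": parse the captured
# string and store the result as a nested field.
event["parsedJson"] = json.loads(match.group("request"))

print(event["parsedJson"])  # {'firstname': 'bob', 'lastname': 'the builder'}
```

Note that firstname and lastname end up nested under parsedJson rather than as top-level fields, which is exactly the behaviour seen in the output above.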
Any help greatly appreciated; I'm sure I'm missing something simple.

Thanks
After your json filter, add another filter called mutate in order to add the two fields that you would take from the parsedJson field.
filter {
  ...
  json {
    ...
  }
  mutate {
    add_field => {
      "firstname" => "%{[parsedJson][firstname]}"
      "lastname"  => "%{[parsedJson][lastname]}"
    }
  }
}
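In plain Python terms (a hypothetical analogue, not Logstash code), the mutate/add_field step simply copies the nested values up to the top level of the event:

```python
# Event as left by the json filter: the names are nested under parsedJson.
event = {"parsedJson": {"firstname": "bob", "lastname": "the builder"}}

# What mutate/add_field does: promote the nested values to top-level
# fields, so Kibana sees them directly.
event["firstname"] = event["parsedJson"]["firstname"]
event["lastname"] = event["parsedJson"]["lastname"]

print(event["firstname"], event["lastname"])  # bob the builder
```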
For your sample log line above that would give:
{
       "message" => "MyLine data={\"firstname\":\"bob\",\"lastname\":\"the builder\"}",
      "@version" => "1",
    "@timestamp" => "2015-11-26T11:54:52.556Z",
          "host" => "iMac.local",
        "MyWord" => "MyLine",
    "parsedJson" => {
        "firstname" => "bob",
         "lastname" => "the builder"
    },
     "firstname" => "bob",
      "lastname" => "the builder"
}