Grok parse error while parsing multiple line messages


Problem description


I am trying to figure out a grok pattern for parsing multiline messages such as exception traces; below is one such log:

2017-03-30 14:57:41 [12345] [qtp1533780180-12] ERROR com.app.XYZ - Exception occurred while processing
java.lang.NullPointerException: null
        at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:162)
        at spark.webserver.JettyHandler.doHandle(JettyHandler.java:61)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:189)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:119)
        at org.eclipse.jetty.server.Server.handle(Server.java:517)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:302)
        at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:242)
        at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:245)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
        at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:75)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:213)
        at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:147)
        at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
        at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
        at java.lang.Thread.run(Thread.java:745)

Here is my logstash.conf

input {
  file {
    path => ["/debug.log"]
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => previous
    }
  }
}

filter {

  mutate {
    gsub => ["message", "r", ""]
  }
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NOTSPACE:uid}\] \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:class}\-%{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}


output {
  elasticsearch { hosts => localhost }
  stdout { codec => rubydebug }
}
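The multiline codec above groups lines into events: with negate => true and what => previous, any line that does not start with a timestamp is folded into the previous event. A minimal Python sketch of that grouping logic (the regex is a simplified stand-in for %{TIMESTAMP_ISO8601}, not the full grok pattern):

```python
import re

# Simplified stand-in for grok's %{TIMESTAMP_ISO8601}:
# a line beginning with "YYYY-MM-DD HH:MM:SS"
TS = re.compile(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}")

def group_multiline(lines):
    """Mimic codec => multiline { negate => true, what => previous }:
    a line NOT matching the pattern is appended to the previous event."""
    events = []
    for line in lines:
        if TS.match(line) or not events:
            events.append(line)            # starts a new event
        else:
            events[-1] += "\n" + line      # continuation of previous event
    return events

lines = [
    "2017-03-30 14:57:41 [12345] [qtp1533780180-12] ERROR com.app.XYZ - Exception occurred while processing",
    "java.lang.NullPointerException: null",
    "        at spark.webserver.MatcherFilter.doFilter(MatcherFilter.java:162)",
    "2017-03-30 14:57:42 [12345] [qtp1533780180-12] ERROR com.app.XYZ - next event",
]
events = group_multiline(lines)
print(len(events))  # 2: the whole stack trace is folded into the first event
```

So by the time the filter runs, the exception trace is one event whose message field contains embedded newlines.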

This works fine for parsing single-line logs, but multiline exception traces come out tagged with "_grokparsefailure".

Can someone please suggest the correct filter pattern for parsing multiline logs?

Solution

The above logstash config worked fine after removing

mutate { gsub => ["message", "r", ""] }
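For context, gsub in the mutate filter performs a regex replace on the named field. As written, the pattern "r" deletes every literal lowercase r from the message (if the intent was to strip Windows carriage returns, the usual pattern would be "\r"). A rough Python equivalent of the removed filter, with re.sub standing in for gsub:

```python
import re

message = ("2017-03-30 14:57:41 [12345] [qtp1533780180-12] "
           "ERROR com.app.XYZ - Exception occurred while processing")

# gsub => ["message", "r", ""] replaces every match of the regex "r" with ""
stripped = re.sub("r", "", message)

print(stripped)
# "occurred" becomes "occued" and "processing" becomes "pocessing",
# while the uppercase "ERROR" is untouched
```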

So the working logstash config for parsing single line & multi line inputs for the above log pattern

input {
  file {
    path => ["./debug.log"]
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => previous
    }
  }
}

filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} \[%{NOTSPACE:uid}\] \[%{NOTSPACE:thread}\] %{LOGLEVEL:loglevel} %{DATA:class}\-%{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
  }
}


output {
  elasticsearch { hosts => localhost }
  stdout { codec => rubydebug }
}
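Grok patterns expand to regular expressions, so the filter above can be sanity-checked outside logstash. A minimal Python sketch using simplified stand-ins for the grok names (TIMESTAMP_ISO8601, NOTSPACE, LOGLEVEL, DATA, GREEDYDATA), applied to the first line of the sample log; the last step also checks that the Joda-style date format yyyy-MM-dd HH:mm:ss corresponds to strptime's %Y-%m-%d %H:%M:%S:

```python
import re
from datetime import datetime

# Rough regex equivalents of the grok pieces used in the filter above
LOG = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})"  # TIMESTAMP_ISO8601 (simplified)
    r" \[(?P<uid>\S+)\]"                                   # \[%{NOTSPACE:uid}\]
    r" \[(?P<thread>\S+)\]"                                # \[%{NOTSPACE:thread}\]
    r" (?P<loglevel>[A-Z]+)"                               # LOGLEVEL (simplified)
    r" (?P<cls>.*?)-(?P<message>.*)"                       # %{DATA:class}\-%{GREEDYDATA:message}
)

line = ("2017-03-30 14:57:41 [12345] [qtp1533780180-12] "
        "ERROR com.app.XYZ - Exception occurred while processing")
m = LOG.match(line)

print(m.group("loglevel"))   # ERROR
print(repr(m.group("cls")))  # 'com.app.XYZ ' -- note the trailing space, since
                             # the grok pattern has no space before the literal \-
ts = datetime.strptime(m.group("timestamp"), "%Y-%m-%d %H:%M:%S")
print(ts.year)               # 2017
```

Because GREEDYDATA's dot does not cross newlines, only the first line of a multiline event needs to satisfy the pattern; the remaining trace lines ride along in the event.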
