logstash _grokparsefailure issues
I'm having issues with grok parsing. In ElasticSearch/Kibana the lines I match come up with the tag _grokparsefailure.
Here is my logstash config :
input {
  file {
    type => "logfile"
    path => ["/var/log/mylog.log"]
  }
}
filter {
  if [type] == "logfile" {
    mutate {
      gsub => ["message", "\"", "'"]
    }
    grok {
      match => { "message" => "L %{DATE} - %{TIME}: " }
    }
  }
}
output {
  elasticsearch { host => "localhost" port => 9300 }
}
The line I'm trying to match: L 08/02/2014 - 22:55:49: Log file closed : " finished "
I tried the debugger on http://grokdebug.herokuapp.com/ and it works fine; my pattern matches correctly.
Lines I want to parse might contain double quotes, and I've read there can be issues with the way grok handles and escapes them. So I tried to mutate to replace " with ' to avoid issues but no luck.
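One way to rule out the pattern itself is to reproduce the match outside Logstash. A minimal Python sketch, using a hand-written regex that approximates %{DATE} and %{TIME} for this specific log line (it is not grok's exact pattern definitions):

```python
import re

# Approximation of "L %{DATE} - %{TIME}: " for a US-style MM/DD/YYYY date
pattern = re.compile(r"L (?P<date>\d{2}/\d{2}/\d{4}) - (?P<time>\d{2}:\d{2}:\d{2}): ")

line = 'L 08/02/2014 - 22:55:49: Log file closed : " finished "'
m = pattern.match(line)
print(m.group("date"), m.group("time"))  # 08/02/2014 22:55:49
```

If this matches but Logstash still tags _grokparsefailure, the problem is in how the pattern string is written in the config file rather than in the pattern itself.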
Any ideas? How can I debug this?
Thanks
Found the issue: it was the double quotes.
The grok filter needs to be defined with single quotes, with the double quotes inside the pattern escaped.
match => { 'message' => 'L %{DATE:date} - %{TIME:time}: \"string_between_doublequotes\" ' }
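Putting the fix together, a sketch of the corrected filter block (the date and time field names and the string_between_doublequotes placeholder come from the match line above; the surrounding blocks are assumed to mirror the original config):

```
filter {
  if [type] == "logfile" {
    grok {
      # Pattern in single quotes, literal double quotes escaped
      match => { 'message' => 'L %{DATE:date} - %{TIME:time}: \"string_between_doublequotes\" ' }
    }
  }
}
```

With the quoting fixed in the grok pattern itself, the earlier mutate/gsub workaround should no longer be needed.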