Sending slowlogs to .csv file?


Problem Description



I am using Logstash 2.4.0. I want to send the slowlogs to a .csv output file using Logstash. My config file looks like this:

input {
   file {
      path => "D:\logstash-2.4.0\logstash-2.4.0\bin\rachu.log"
      start_position => "beginning"
   }
}

filter {
   grok {
      match => [ "message", "\[%{TIMESTAMP_ISO8601:TIMESTAMP}\]\[%{LOGLEVEL:LEVEL}%{SPACE}\]\[%{DATA:QUERY}\]%{SPACE}\[%{DATA:QUERY1}\]%{SPACE}\[%{DATA:INDEX-NAME}\]\[%{DATA:SHARD}\]%{SPACE}took\[%{DATA:TOOK}\],%{SPACE}took_millis\[%{DATA:TOOKM}\], types\[%{DATA:types}\], stats\[%{DATA:stats}\],search_type\[%{DATA:search_type}\], total_shards\[%{NUMBER:total_shards}\], source\[%{DATA:source_query}\], extra_source\[%{DATA:extra_source}\]," ]
   }
}
output {
   csv {
      fields => ["TIMESTAMP","LEVEL","QUERY","QUERY1","INDEX-NAME","SHARD","TOOK","TOOKM","types","stats","search_type","total_shards","source_query","extra_source"]
      path => "D:\logstash-2.4.0\logstash-2.4.0\bin\logoutput.csv"
      spreadsheet_safe => false
   }

}

Solution

The csv filter is not useful in your context. Its purpose is to parse incoming CSV data, which is not what you have. What you need is to parse the log lines with a grok filter first; only then can you send them properly to the csv output:

filter {
   grok {
      match => {"message" => "\[%{TIMESTAMP_ISO8601:TIMESTAMP}\]\[%{LOGLEVEL:LOGLEVEL} \]\[%{DATA:QUERY}\] \[%{WORD:QUERY1}\] \[%{WORD:INDEX}\]\[%{INT:SHARD}\] took\[%{BASE10NUM:TOOK}ms\], took_millis\[%{BASE10NUM:took_millis}\], types\[%{DATA:types}\], stats\[%{DATA:stats}\], search_type\[%{DATA:search_type}\], total_shards\[%{INT:total_shards}\], source\[%{DATA:source}\], extra_source\[%{DATA:extra_source}\]"}
   }
}
output {
   csv {
      fields => ["TIMESTAMP","LOGLEVEL","QUERY","QUERY1","INDEX","SHARD","TOOK","took_millis","types","stats","search_type","total_shards","source","extra_source"]
      path => "F:\logstash-5.1.1\logstash-5.1.1\finaloutput1"
      spreadsheet_safe => false
   }
}
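To make the mapping concrete, here is a minimal Python sketch of what this pipeline does: a regex with named groups standing in for the grok pattern, and the csv module standing in for the csv output plugin. The sample log line, node name, and index name are invented for illustration; grok's DATA/INT/BASE10NUM patterns are only approximated with character classes.

```python
import csv
import io
import re

# Named-group regex roughly mirroring the grok pattern above.
SLOWLOG_RE = re.compile(
    r"\[(?P<TIMESTAMP>[^\]]+)\]"
    r"\[(?P<LOGLEVEL>\w+)\s*\]"
    r"\[(?P<QUERY>[^\]]+)\]\s*"
    r"\[(?P<QUERY1>[^\]]+)\]\s*"
    r"\[(?P<INDEX>[^\]]+)\]"
    r"\[(?P<SHARD>\d+)\]\s*"
    r"took\[(?P<TOOK>[\d.]+)ms\],\s*"
    r"took_millis\[(?P<took_millis>\d+)\],\s*"
    r"types\[(?P<types>[^\]]*)\],\s*"
    r"stats\[(?P<stats>[^\]]*)\],\s*"
    r"search_type\[(?P<search_type>[^\]]*)\],\s*"
    r"total_shards\[(?P<total_shards>\d+)\],\s*"
    r"source\[(?P<source>[^\]]*)\],\s*"
    r"extra_source\[(?P<extra_source>[^\]]*)\]"
)

# Same order as the csv output's fields list; names match the grok captures.
FIELDS = ["TIMESTAMP", "LOGLEVEL", "QUERY", "QUERY1", "INDEX", "SHARD",
          "TOOK", "took_millis", "types", "stats", "search_type",
          "total_shards", "source", "extra_source"]

# Invented sample entry in the Elasticsearch slowlog layout.
line = ('[2017-01-14 10:59:58,591][WARN ][index.search.slowlog.query] '
        '[node-1] [rachu][0] took[124.3ms], took_millis[124], types[], '
        'stats[], search_type[QUERY_THEN_FETCH], total_shards[5], '
        'source[{"query":{"match_all":{}}}], extra_source[]')

event = SLOWLOG_RE.search(line).groupdict()           # the "grok" step
buf = io.StringIO()
csv.writer(buf).writerow([event[f] for f in FIELDS])  # the "csv output" step
print(buf.getvalue().strip())
```

In the real pipeline grok does the capturing and the csv output plugin does the writing; the point of the sketch is that the field names captured by grok must match the names listed under fields, otherwise the corresponding CSV columns come out empty.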

Note: this doesn't yet work on Logstash 5.1.1 because of this open issue. It should get fixed soon, but in the meantime this works on Logstash 2.4.
