Kibana Logstash ElasticSearch | Unindexed fields cannot be searched


Problem description

I am exploring the ELK stack and have come across an issue.

I have generated logs and forwarded them to Logstash. The logs are in JSON format, so they are pushed directly into ES with only a JSON filter in the Logstash config. I then connected and started Kibana pointing to ES.

Logstash Config:

 filter {
   json {
     # parse the JSON string in the "message" field into top-level event fields
     source => "message"
   }
 }

Now I have indexes created for each day's log and Kibana happily shows all of the logs from all indexes.

My issue is: there are many fields in the logs which are not enabled/indexed for filtering in Kibana. When I try to add them to the filter in Kibana, it says "unindexed fields cannot be searched".

Note: these are not sys/apache logs. These are custom logs in JSON format.

Log format:

{"message":"ResponseDetails","@version":"1","@timestamp":"2015-05-23T03:18:51.782Z","type":"myGateway","file":"/tmp/myGatewayy.logstash","host":"localhost","offset":"1072","data":"text/javascript","statusCode":200,"correlationId":"a017db4ebf411edd3a79c6f86a3c0c2f","docType":"myGateway","level":"info","timestamp":"2015-05-23T03:15:58.796Z"}

Fields like 'statusCode' and 'correlationId' are not getting indexed. Any reason why?

Do I need to give a Mapping file to ES to ask it to index either all or given fields?
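For context, Elasticsearch's dynamic mapping normally indexes new JSON fields automatically, so an explicit mapping is usually only needed to control field types or analysis. If one does turn out to be needed, one common approach (not taken from the setup above) is an index template. The sketch below uses Elasticsearch 1.x-era syntax; the template name, the "logstash-*" index pattern, the "myGateway" document type, and the field types are illustrative assumptions based on the sample log, not confirmed details of this setup.

  # Hypothetical index template pinning types for the two fields in question.
  # "logstash-*" and "myGateway" are assumptions taken from the sample log above;
  # adjust the names and types to your own indices before using.
  curl -XPUT 'http://localhost:9200/_template/mygateway_template' -d '
  {
    "template": "logstash-*",
    "mappings": {
      "myGateway": {
        "properties": {
          "statusCode":    { "type": "integer" },
          "correlationId": { "type": "string", "index": "not_analyzed" }
        }
      }
    }
  }'

Note that a template only applies to indices created after it is installed, so existing daily indices would keep their current mapping.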

Answer

You've updated the Kibana field list?

  1. Kibana.
  2. Settings.
  3. Reload field list.

Newer versions:

  1. Kibana.
  2. Management.
  3. Refresh icon at the top right.
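
If refreshing the field list does not help, one quick way to confirm whether the fields ever made it into the Elasticsearch mapping is to query ES directly. This is a sketch assuming the default "logstash-*" daily index naming; substitute your own index names.

  # Show the mapping for the daily indices; statusCode and correlationId should
  # appear under "properties" if they were indexed ("logstash-*" is an assumption).
  curl -XGET 'http://localhost:9200/logstash-*/_mapping?pretty'

  # Spot-check that a search on one of the fields actually returns documents.
  curl -XGET 'http://localhost:9200/logstash-*/_search?q=statusCode:200&pretty'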

