ElasticSearch indexing issue: failed to parse timestamp


Problem description

I am new to ELK. I have created the following index in Elasticsearch:

{
  "logstash": {
    "aliases": {},
    "mappings": {
      "log": {
        "dynamic_templates": [
          {
            "message_field": {
              "path_match": "message",
              "match_mapping_type": "string",
              "mapping": {
                "norms": false,
                "type": "text"
              }
            }
          },
          {
            "string_fields": {
              "match": "*",
              "match_mapping_type": "string",
              "mapping": {
                "fields": {
                  "keyword": {
                    "type": "keyword"
                  }
                },
                "norms": false,
                "type": "text"
              }
            }
          }
        ],
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "@version": {
            "type": "keyword",
            "include_in_all": false
          },
          "activity": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "beat": {
            "properties": {
              "hostname": {
                "type": "text",
                "norms": false,
                "fields": {
                  "keyword": {
                    "type": "keyword"
                  }
                }
              },
              "name": {
                "type": "text",
                "norms": false,
                "fields": {
                  "keyword": {
                    "type": "keyword"
                  }
                }
              },
              "version": {
                "type": "text",
                "norms": false,
                "fields": {
                  "keyword": {
                    "type": "keyword"
                  }
                }
              }
            }
          },
          "filename": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "host": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "input_type": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "message": {
            "type": "text",
            "norms": false
          },
          "offset": {
            "type": "long"
          },
          "source": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "tags": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "timestamp": {
            "type": "date",
            "include_in_all": false,
            "format": "YYYY-MM-DD HH:mm:ss.SSS"
          },
          "type": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "user": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          }
        }
      }
    },
    "settings": {
      "index": {
        "creation_date": "1488805244467",
        "number_of_shards": "1",
        "number_of_replicas": "0",
        "uuid": "5ijhh193Tr6y_hxaQrW9kg",
        "version": {
          "created": "5020199"
        },
        "provided_name": "logstash"
      }
    }
  }
}

Below is my Logstash configuration:

input {
    beats {
        port => 5044
    }
}
filter {
    grok {
        match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] ALL AUDIT: User \[%{GREEDYDATA:user}\] is %{GREEDYDATA:activity} \[%{GREEDYDATA:filename}\] for transfer." }
    }
}
output {
    elasticsearch {
        hosts => "localhost:9200"
        index => "logstash"
    }
}

Sample data:

[2017-03-05 12:37:21.465] ALL AUDIT: User [user1] is opening file [filename1] for transfer.

But when I load the file through Filebeat > Logstash > Elasticsearch, I get the error below in Elasticsearch:

org.elasticsearch.index.mapper.MapperParsingException: failed to parse [timestamp]
Caused by: java.lang.IllegalArgumentException: Invalid format: "2017-03-05T12:36:33.606" is malformed at "12:36:33.606"
    at org.joda.time.format.DateTimeParserBucket.doParseMillis(DateTimeParserBucket.java:187) ~[joda-time-2.9.5.jar:2.9.5]

Please help: what timestamp format should I configure?

Accepted answer

In your timestamp mapping you declared the format as "YYYY-MM-DD HH:mm:ss.SSS". The value you are sending through Beats is not in that format; compare: 2017-03-05T12:36:33.606

That's why Elasticsearch is complaining about the format. Your format should be: "yyyy-MM-dd'T'HH:mm:ss.SSS" (notice the literal capital T between the date and the time). Prefer lowercase yyyy and dd as well: in the Joda-Time patterns used by Elasticsearch 5.x, uppercase YYYY is week-year and DD is day-of-year, which can silently produce wrong dates.
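Applied to the index from the question, the fix is a one-field mapping change. Below is a sketch of the update request, assuming Elasticsearch 5.x and the index/type names from the question (logstash/log). Documents already indexed under the old format are not rewritten by this call, and if Elasticsearch rejects the in-place format change, the index must be recreated with the corrected mapping and reindexed:

```json
PUT logstash/_mapping/log
{
  "properties": {
    "timestamp": {
      "type": "date",
      "include_in_all": false,
      "format": "yyyy-MM-dd'T'HH:mm:ss.SSS"
    }
  }
}
```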

See the documentation for more details: https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-date-format.html
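An alternative worth noting (not part of the original answer, just a common Logstash pattern): instead of tightening the mapping's format, parse the grok-captured timestamp field with Logstash's date filter, which writes the result into @timestamp; the timestamp field can then keep Elasticsearch's default date formats. A sketch of the extended filter block, listing both the space-separated form seen in the sample log and the T-separated form seen in the error message as candidate patterns:

```
filter {
    grok {
        match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] ALL AUDIT: User \[%{GREEDYDATA:user}\] is %{GREEDYDATA:activity} \[%{GREEDYDATA:filename}\] for transfer." }
    }
    date {
        # try both observed layouts; the first pattern that matches wins
        match  => [ "timestamp", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSS" ]
        target => "@timestamp"
    }
}
```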
