Convert timestamp timezone in Logstash for output index name


Question


In my scenario, the "timestamp" of the syslog lines Logstash receives is in UTC and we use the event "timestamp" in the Elasticsearch output:

output {
    elasticsearch {
        embedded => false
        host => localhost
        port => 9200
        protocol => http
        cluster => 'elasticsearch'
        index => "syslog-%{+YYYY.MM.dd}"
    }
}

My problem is that at UTC midnight, Logstash starts sending logs to a different index before the day is over in our timezone (GMT-4 => America/Montreal), so the index has no logs after 20:00 (8 PM) because the "timestamp" is in UTC.
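
To make the day boundary concrete (the times below are hypothetical): an event logged at 21:30 local time on March 29 is 01:30 UTC on March 30, so the UTC-based %{+YYYY.MM.dd} pattern puts it in the next day's index:

require 'time'

# Hypothetical event: 21:30 local time in America/Montreal (UTC-4 in summer)
local = Time.parse('2015-03-29 21:30:00 -04:00')
utc   = local.utc

# Index derived from the UTC @timestamp, i.e. what %{+YYYY.MM.dd} does
puts utc.strftime('syslog-%Y.%m.%d')    # => syslog-2015.03.30

# Index we actually want, based on the local calendar day
puts local.strftime('syslog-%Y.%m.%d')  # => syslog-2015.03.29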

We've put together a workaround to convert the timezone, but we experience a significant performance degradation:

filter {
    mutate {
        add_field => {
            # Create a new field with string value of the UTC event date
            "timestamp_zoned" => "%{@timestamp}"
        }
    }

    date {
        # Parse UTC string value and convert it to my timezone into a new field
        match => [ "timestamp_zoned", "yyyy-MM-dd HH:mm:ss Z" ]
        timezone => "America/Montreal"
        locale => "en"
        remove_field => [ "timestamp_zoned" ]
        target => "timestamp_zoned_obj"
    }

    ruby {
        # Output the zoned date to a new field
        code => "event['index_day'] = event['timestamp_zoned_obj'].strftime('%Y.%m.%d')"
        remove_field => [ "timestamp_zoned_obj" ]
    }
}

output {
    elasticsearch {
        embedded => false
        host => localhost
        port => 9200
        protocol => http
        cluster => 'elasticsearch'
        # Use of the string value
        index => "syslog-%{index_day}"
    }
}

Is there a way to optimize this config?

Solution

Here is the optimized config; please give it a try and test the performance.

You don't need to use the mutate and date plugins; use the ruby plugin directly.

input {
    stdin {
    }
}

filter {
    ruby {
            code => "
                    # Convert the UTC @timestamp to local time and keep the calendar day
                    event['index_day'] = event['@timestamp'].localtime.strftime('%Y.%m.%d')
            "
    }
}

output {
    stdout { codec => rubydebug }
}

Example output:

{
       "message" => "test",
      "@version" => "1",
    "@timestamp" => "2015-03-30T05:27:06.310Z",
          "host" => "BEN_LIM",
     "index_day" => "2015.03.29"
}
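
Note that the event['...'] hash syntax only works with the ruby filter in Logstash 2.x and earlier; from Logstash 5.0 on, fields go through the Event API (event.get / event.set). A rough, untested sketch of the same filter for newer versions:

filter {
    ruby {
        code => "
            # Same idea with the Event API: convert the UTC @timestamp
            # to local time and keep the calendar day for the index name
            event.set('index_day', event.get('@timestamp').time.localtime.strftime('%Y.%m.%d'))
        "
    }
}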
