Serilog HTTP sink + Logstash: Splitting Serilog message array into individual log events


Question

We're using the Serilog HTTP sink to send messages to Logstash, but the HTTP message body looks like this:

{
  "events": [
    {
      "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
      "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
      "Properties": {
        "Heartbeat": {
          "UserName": "Mike",
          "UserDomainName": "Home"
        },
        "Computer": "Workstation"
      }
    },
    {
      "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
      "Level": "Debug",
      "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
      "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
      "Properties": {
        "Heartbeat": {
          "UserName": "Mike",
          "UserDomainName": "Home"
        },
        "Computer": "Workstation"
      }
    }
  ]
}

i.e. the logging events are batched in an array. It is possible to send the messages one by one, but then it is still a one-item array.

The event is then displayed in Kibana as having a message field with the value

{
  "events": [
    {
      // ...
    },
    {
      // ...
    }
  ]
}

i.e. literally what came in from the HTTP input.

How can I split the items in the events array into individual logging events and "pull up" the properties to the top level, so that I would have two logging events in Elasticsearch:

  "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
  "Level": "Debug",
  "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
  "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
  "Properties": {
    "Heartbeat": {
      "UserName": "Mike",
      "UserDomainName": "Home"
    },
    "Computer": "Workstation"
  }


  "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
  "Level": "Debug",
  "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
  "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
  "Properties": {
    "Heartbeat": {
      "UserName": "Mike",
      "UserDomainName": "Home"
    },
    "Computer": "Workstation"
  }


I tried the Logstash json and split filters, but I couldn't make them work.

Answer

After upgrading to Logstash 5.0, Val's solution stopped working due to a change in the Event API: updates to event.to_hash were no longer reflected in the original event. For Logstash 5.0+, the event.get('field') and event.set('field', value) accessors must be used instead.
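The behavioural difference can be sketched with a minimal stand-in class (FakeEvent is hypothetical, written only to illustrate the copy semantics; it is not the real Logstash Event class):

```ruby
# Minimal stand-in illustrating the Logstash 5.0+ Event API semantics.
# FakeEvent is hypothetical, for illustration only.
class FakeEvent
  def initialize(data)
    @data = data
  end

  # Like the 5.0+ API, to_hash returns a copy, so mutating the
  # returned hash does not touch the event's own data.
  def to_hash
    @data.dup
  end

  def get(field)
    @data[field]
  end

  def set(field, value)
    @data[field] = value
  end
end

event = FakeEvent.new('Level' => 'Debug')

copy = event.to_hash
copy['Computer'] = 'Workstation'             # mutates only the copy
after_copy_mutation = event.get('Computer')  # still nil

event.set('Computer', 'Workstation')         # this change persists
after_set = event.get('Computer')
```

This is why filter code that worked by mutating event.to_hash in place breaks on 5.0+ while get/set continues to work.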

The updated solution is:

input {
  http {
    port => 8080
    codec => json
  }
}

filter {
  # Split the "events" array into one event per array element
  split {
    field => "events"
  }
  # Promote each key of the split element to the top level
  # (Logstash 5.0+ Event API: use event.get / event.set)
  ruby {
    code => "
      event.get('events').each do |k, v|
        event.set(k, v)
      end
    "
  }
  # Drop the now-redundant "events" field
  mutate {
    remove_field => [ "events" ]
  }
}
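Outside Logstash, the effect of the split + ruby + mutate chain can be simulated in plain Ruby (this is a sketch of the filter logic on the sample payload, not Logstash code): each element of the events array becomes one event hash with its keys promoted to the top level and the events field removed.

```ruby
require 'json'

# A trimmed-down version of the batched body the Serilog HTTP sink posts.
body = <<~JSON
  {
    "events": [
      { "Timestamp": "2016-11-03T00:09:11.4899425+01:00", "Level": "Debug" },
      { "Timestamp": "2016-11-03T00:09:12.4905685+01:00", "Level": "Debug" }
    ]
  }
JSON

payload = JSON.parse(body)

# split: one event per array element
# ruby:  copy each key of the element to the top level
# mutate: remove the leftover "events" field
events = payload['events'].map do |item|
  event = { 'events' => item }              # shape of one split event
  event['events'].each { |k, v| event[k] = v }
  event.delete('events')
  event
end

events.each { |e| puts e['Timestamp'] }
```

The result is two independent top-level event hashes, which is exactly what Elasticsearch receives as two documents.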

