How to send Suricata log to Kafka?


Problem description

Based on the documentation at https://suricata.readthedocs.io/.

I tried to change some configuration in suricata.yaml by adding:

- alert-json-log:
      enabled: yes
      filetype: kafka
      kafka:
        brokers: > 
         xxx-kafka-online003:9092,
         xxx-kafka-online004:9092,
         xxx-kafka-online005:9092,
         xxx-kafka-online006:9092,
         xxx-kafka-online007:9092
        topic: nsm_event
        partitions: 5
      http: yes

Next I run Suricata, and receive the error: Invalid entry for alert-json-log.filetype. Expected "regular" (default), "unix_stream", "pcie" or "unix_dgram"

I don't know how to configure Suricata to enable sending logs to Kafka topics. Please help.

Recommended answer

I don't see Kafka listed as an output type, so no, there is not.

Reference: https://suricata.readthedocs.io/en/suricata-5.0.2/output/index.html

Plus, I'm not sure what you expect http: yes to do, since Kafka is not an HTTP service.

What you could do is set filetype: unix_stream, then (assuming that behaves like syslog) add another service such as Kafka Connect, Fluentd, or Logstash to route that data to Kafka.
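As a concrete sketch of that routing idea (not part of the original answer): Suricata can keep writing its JSON events to a regular log file, and a Logstash pipeline can tail that file and forward each event to Kafka via the standard Logstash kafka output plugin. The file path, broker addresses, and topic name below are placeholders borrowed from the question; adjust them to your environment.

```conf
# logstash.conf -- hypothetical pipeline: tail Suricata's JSON log and ship it to Kafka
input {
  file {
    path => "/var/log/suricata/eve.json"   # Suricata output with filetype: regular
    codec => "json"                        # each line is one JSON event
    start_position => "beginning"
  }
}
output {
  kafka {
    bootstrap_servers => "xxx-kafka-online003:9092,xxx-kafka-online004:9092"
    topic_id => "nsm_event"
    codec => "json"
  }
}
```

Run it with `logstash -f logstash.conf`; Filebeat or Fluentd could fill the same role with equivalent file-input and Kafka-output configuration.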

In other words, services don't need to integrate with Kafka directly. Plenty of alternatives exist to read files or stdout/stderr/syslog streams.
