Kafka-Connect vs Filebeat & Logstash


Problem description

I'm looking to consume from Kafka and save data into Hadoop and Elasticsearch. I currently see two ways of doing this: using Filebeat to consume from Kafka and send the data to ES, or using the Kafka Connect framework. There are Kafka-Connect-HDFS and Kafka-Connect-Elasticsearch modules.
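
For reference, both of those Connect sinks are driven purely by configuration. A minimal sketch of the two connector configs might look like the following; the hosts, topic name, and flush size are placeholders, and the exact keys should be checked against the kafka-connect-elasticsearch and kafka-connect-hdfs documentation:

    # elasticsearch-sink.properties (sketch; placeholder host and topic)
    name=es-sink
    connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
    tasks.max=1
    topics=app-logs
    connection.url=http://localhost:9200
    key.ignore=true

    # hdfs-sink.properties (sketch; placeholder host and topic)
    name=hdfs-sink
    connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
    tasks.max=1
    topics=app-logs
    hdfs.url=hdfs://localhost:8020
    flush.size=1000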

I'm not sure which one to use to send streaming data. I do think that if, at some point, I want to take data from Kafka and put it into Cassandra, I can use a Kafka Connect module for that, but no such feature exists for Filebeat.

Recommended answer

Kafka Connect can handle streaming data and is a bit more flexible. If you are just going to Elasticsearch, Filebeat is a clean integration for log sources. However, if you are going from Kafka to a number of different sinks, Kafka Connect is probably what you want. I'd recommend checking out the connector hub to see some examples of the open source connectors currently at your disposal: http://www.confluent.io/product/connectors/
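
To illustrate the "several sinks from one framework" point: a single standalone Connect worker can load multiple connector property files at once. Assuming the two sketch files above and a stock Kafka/Confluent installation with both connector plugins available, starting them together looks roughly like this:

    bin/connect-standalone.sh config/connect-standalone.properties \
        elasticsearch-sink.properties hdfs-sink.properties

For the Elasticsearch-only path, newer Filebeat releases ship a Kafka input, so the equivalent Filebeat side is a short filebeat.yml; the broker address, topic, and group id below are placeholders:

    filebeat.inputs:
      - type: kafka
        hosts: ["localhost:9092"]
        topics: ["app-logs"]
        group_id: "filebeat"

    output.elasticsearch:
      hosts: ["http://localhost:9200"]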

