Is it possible to use Kafka with Google Cloud Dataflow?
Question
I have two questions:
1) I want to use Kafka in a Google Cloud Dataflow pipeline program. In my pipeline program, I want to read data from Kafka. Is that possible?
2) I created an instance with BigQuery enabled; now I want to enable Pub/Sub. How can I do that?
Answer
(1) As mentioned by Raghu, support for writing to and reading from Kafka was added to Apache Beam in mid-2016 with the KafkaIO package. You can check the package's documentation [1] to see how to use it.
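As a rough sketch of what reading from Kafka with KafkaIO can look like, consider the pipeline below. The broker address (`localhost:9092`) and topic name (`my-topic`) are placeholders, and the code assumes the `beam-sdks-java-io-kafka` artifact is on the classpath along with a runner (e.g. the DataflowRunner when deploying to Google Cloud Dataflow):

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Values;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaReadExample {
  public static void main(String[] args) {
    // Runner (e.g. DataflowRunner) and its options come from the command line.
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(KafkaIO.<String, String>read()
            .withBootstrapServers("localhost:9092")   // placeholder broker address
            .withTopic("my-topic")                    // placeholder topic name
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata())                       // drop Kafka metadata, keep KV pairs
        .apply(Values.<String>create());              // extract just the message values

    p.run().waitUntilFinish();
  }
}
```

The resulting `PCollection<String>` of message values can then be transformed and written to a sink such as BigQuery, like any other Beam input.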
(2) I'm not quite sure what you mean. Can you provide more details?
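If "enable Pubsub" refers to turning on the Pub/Sub API for a Google Cloud project (rather than instance access scopes), one common way is via the gcloud CLI, assuming it is installed and authenticated:

```shell
# Enable the Pub/Sub API for the currently configured project
# (or pass --project=PROJECT_ID explicitly).
gcloud services enable pubsub.googleapis.com
```

If the question is instead about the BigQuery/Pub/Sub access scopes on an existing Compute Engine instance, those are set per service account and scope on the instance, which is a different operation.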
[1] https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/io/kafka/KafkaIO.html