Is it possible to use Kafka with Google Cloud Dataflow?
Question
I have two questions.
1) I want to use Kafka with a Google Cloud Dataflow pipeline program. In my pipeline I want to read data from Kafka. Is that possible?
2) I created an instance with BigQuery enabled; now I want to enable Pub/Sub. How can I do that?
Answer
(1) As mentioned by Raghu, support for writing to and reading from Kafka was added to Apache Beam in mid-2016 with the KafkaIO package. You can check the package's documentation [1] to see how to use it.
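For reference, a minimal sketch of reading from Kafka in a Beam pipeline with KafkaIO might look like the following. The broker address, topic name, and project/option values are placeholders, and the snippet assumes the `beam-sdks-java-io-kafka` and Kafka client dependencies are on the classpath; when run with the Dataflow runner you would also pass the usual Dataflow pipeline options.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaReadSketch {
  public static void main(String[] args) {
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
    Pipeline pipeline = Pipeline.create(options);

    // Read key/value pairs from a Kafka topic; broker and topic are
    // placeholder values for illustration.
    PCollection<KV<String, String>> messages =
        pipeline.apply(
            KafkaIO.<String, String>read()
                .withBootstrapServers("kafka-broker:9092")
                .withTopic("my-topic")
                .withKeyDeserializer(StringDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                // Drop Kafka metadata to get plain KV pairs.
                .withoutMetadata());

    // ...apply further transforms to `messages`, then:
    pipeline.run();
  }
}
```

Downstream you would chain ordinary Beam transforms (e.g. `ParDo`, windowing) onto `messages` before running the pipeline on Dataflow.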
(2) I'm not quite sure what you mean. Can you provide more details?
[1] https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/io/kafka/KafkaIO.html