How to access/read kafka topic data from flink?
Problem description
I am trying to read Kafka data from Flink, and since I am new to both Kafka and Flink, I don't know how to connect them.
Recommended answer
Flink provides a Kafka connector. In order to read data from a Kafka topic, you first need to add the Flink-Kafka connector dependency:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
    <version>1.1.3</version>
</dependency>
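Note that this artifact targets the old Kafka 0.8 client and Flink 1.1.x. On recent Flink releases (1.12 and later) the versioned connectors were consolidated into a single artifact; a sketch of the equivalent dependency, assuming a `flink.version` property is defined in your POM, would look like:

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka</artifactId>
    <version>${flink.version}</version>
</dependency>
```

Check the Flink documentation for the connector version matching your Flink release.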
Next, you simply obtain the streaming execution environment and add the Kafka source. Here is a sample:
// Set up the streaming execution environment
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "localhost:9092");
properties.setProperty("zookeeper.connect", "localhost:2181"); // required by the 0.8 consumer
properties.setProperty("group.id", "test");

// Create a stream from the Kafka topic "topic" and print each record
DataStream<String> stream = env
    .addSource(new FlinkKafkaConsumer08<>("topic", new SimpleStringSchema(), properties));
stream.print();

env.execute("Kafka consumer job");
That's it. You are all set to consume data from the Kafka topic.
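On current Flink versions, `FlinkKafkaConsumer08` no longer exists; the modern API uses a `KafkaSource` builder and no longer needs the ZooKeeper address. A minimal sketch of the same job written against the newer API (topic name `topic`, group id `test`, and broker address are carried over from the example above) would look like this:

```java
import java.util.Properties;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KafkaReadJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Build a Kafka source: broker list, topic, consumer group, and
        // a deserializer for the record value (plain strings here)
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("topic")
                .setGroupId("test")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        // Attach the source; no event-time watermarks are needed for this simple job
        DataStream<String> stream =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "Kafka Source");
        stream.print();

        env.execute("Kafka consumer job");
    }
}
```

Running this requires a Flink runtime and a reachable Kafka broker, so it is a sketch to adapt rather than a standalone program you can execute in isolation.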