How to access/read kafka topic data from flink?


Problem Description

I am trying to read Kafka data from Flink, and since I am new to both Kafka and Flink, I don't know how to connect them.

Recommended Answer

Flink provides a Kafka connector. In order to read data from Kafka topics, you first need to add the Flink Kafka connector dependency.

<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-connector-kafka-0.8_2.10</artifactId>
   <version>1.1.3</version>
</dependency>

Next, you simply obtain the streaming execution environment and add the Kafka source. Here is a sample:

Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "localhost:9092");
properties.setProperty("zookeeper.connect", "localhost:2181");
properties.setProperty("group.id", "test");

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<String> stream = env
    .addSource(new FlinkKafkaConsumer08<>("topic", new SimpleStringSchema(), properties));
stream.print();
env.execute();
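The `SimpleStringSchema` passed to the consumer above tells Flink how to turn each raw Kafka record into a `String`. As a rough illustration of what that deserialization step does (this standalone `StringDecoder` class is a hypothetical sketch, not Flink's actual `DeserializationSchema` implementation), the core conversion is just decoding the record's bytes as UTF-8:

```java
import java.nio.charset.StandardCharsets;

// Hypothetical sketch: the essence of a string deserialization schema
// is converting a Kafka record's byte payload into a Java String.
public class StringDecoder {
    public static String deserialize(byte[] message) {
        // Kafka hands the consumer raw bytes; decode them as UTF-8 text.
        return new String(message, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] raw = "hello kafka".getBytes(StandardCharsets.UTF_8);
        System.out.println(StringDecoder.deserialize(raw));
    }
}
```

If your topic carries JSON, Avro, or another format instead of plain text, you would supply a different deserialization schema to the consumer rather than `SimpleStringSchema`.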

That's it. You are all set to consume data from the Kafka topic.

