Flink with Kafka Consumer doesn't work
Problem description
I want to benchmark Spark vs Flink, and for this purpose I am running several tests. However, Flink doesn't work with Kafka, while the same setup works perfectly with Spark.
The code is quite simple:
import java.util.Properties

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

val env: StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment

// Connection settings for the Kafka 0.9 consumer
val properties = new Properties()
properties.setProperty("bootstrap.servers", "localhost:9092")
properties.setProperty("group.id", "myGroup")

println("topic: " + args(0))
val stream = env.addSource(new FlinkKafkaConsumer09[String](args(0), new SimpleStringSchema(), properties))
stream.print
env.execute()
I use Kafka 0.9.0.0 with the same topic on both sides (the Flink consumer and the Kafka console producer), but when I submit my jar to the cluster, nothing happens:
What could be happening?
Recommended answer
Your stream.print will not print to the console you submit the job from; when the job runs on a cluster, the output is written to flink0.9/logs/recentlog on the worker instead. Alternatively, you can add your own logger to confirm the output.
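The "add your own logger" suggestion can be sketched as routing each element through a small logging function (passed to `stream.map` in the real job) rather than relying on `print`. Below is a minimal, hedged illustration of the idea in plain Scala, with a `List` standing in for the `DataStream[String]` so it runs without a Flink cluster; the object name `LogSketch` and the log prefix are made up for this example, and a production job would typically use an SLF4J logger instead of `println`:

```scala
object LogSketch {
  // Hypothetical logging pass-through: in a real Flink job this would be the
  // function given to stream.map, and the println would go to the
  // TaskManager's log/stdout file, not the submitting console.
  def logElement(msg: String): String = {
    println(s"[kafka-record] $msg")
    msg  // return the element unchanged so the stream continues downstream
  }

  def main(args: Array[String]): Unit = {
    // Stand-in for the Kafka-backed DataStream[String]
    val records = List("first", "second", "third")
    val logged = records.map(logElement)
    println(s"processed ${logged.size} records")
  }
}
```

In the actual job this would read `stream.map(logElement)` before `env.execute()`, and the log lines would appear in the worker's log files, which makes it easy to confirm whether records are arriving from Kafka at all.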