Spark Streaming Kafka Integration Direct Approach EOFException
Question
When I run the Spark Streaming example org.apache.spark.examples.streaming.JavaDirectKafkaWordCount, I get an EOFException. The stack trace follows. How can I resolve it?
Exception in thread "main" org.apache.spark.SparkException: java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
java.io.EOFException: Received -1 when reading from channel, socket has likely been closed.
at org.apache.spark.streaming.kafka.KafkaUtils$$anonfun$createDirectStream$2.apply(KafkaUtils.scala:413)
at org.apache.spark.streaming.kafka.KafkaUtils$$anonfun$createDirectStream$2.apply(KafkaUtils.scala:413)
at scala.util.Either.fold(Either.scala:97)
at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:412)
at org.apache.spark.streaming.kafka.KafkaUtils$.createDirectStream(KafkaUtils.scala:528)
at org.apache.spark.streaming.kafka.KafkaUtils.createDirectStream(KafkaUtils.scala)
Answer
The direct stream uses a low-level Kafka consumer underneath, so you need to provide the list of brokers. Most likely you haven't, so you may want to set the metadata.broker.list property in the form bhost1:9092,bhost2:9092,...,bhostN:9092.
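As a minimal sketch, the fix amounts to putting the broker list into the kafkaParams map handed to KafkaUtils.createDirectStream. The broker hostnames bhost1/bhost2 below are placeholders, and the commented-out stream creation assumes the spark-streaming-kafka 0.8-style API used by the JavaDirectKafkaWordCount example:

```java
import java.util.HashMap;
import java.util.Map;

public class KafkaParamsExample {
    public static void main(String[] args) {
        // Parameters passed to KafkaUtils.createDirectStream. The direct
        // approach talks to the Kafka brokers themselves, so
        // metadata.broker.list must hold broker host:port pairs --
        // NOT the ZooKeeper address (that is what causes the
        // "Received -1 when reading from channel" EOFException).
        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("metadata.broker.list", "bhost1:9092,bhost2:9092");

        System.out.println(kafkaParams.get("metadata.broker.list"));

        // The stream would then be created roughly like this
        // (sketch of the 0.8 direct API; jssc and topicsSet assumed):
        // JavaPairInputDStream<String, String> stream =
        //     KafkaUtils.createDirectStream(
        //         jssc, String.class, String.class,
        //         StringDecoder.class, StringDecoder.class,
        //         kafkaParams, topicsSet);
    }
}
```

Note that the value must point at the Kafka brokers' listener port (9092 by default), not ZooKeeper's 2181.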
See also Kafka SimpleConsumer cannot connect to zookeeper: Received -1 when reading from channel.