Spark streaming StreamingContext.start() - Error starting receiver 0
Problem description
I have a project that's using Spark Streaming, and I'm running it with 'spark-submit', but I'm hitting this error:
15/01/14 10:34:18 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.lang.AbstractMethodError
at org.apache.spark.Logging$class.log(Logging.scala:52)
at org.apache.spark.streaming.kafka.KafkaReceiver.log(KafkaInputDStream.scala:66)
at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
at org.apache.spark.streaming.kafka.KafkaReceiver.logInfo(KafkaInputDStream.scala:66)
at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:86)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
at org.apache.spark.scheduler.Task.run(Task.scala:54)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
This is the code the error is coming from; everything runs fine up until ssc.start():
val Array(zkQuorum, group, topics, numThreads) = args
val sparkConf = new SparkConf().setAppName("Jumbly_StreamingConsumer")
val ssc = new StreamingContext(sparkConf, Seconds(2))
ssc.checkpoint("checkpoint")
// ...
ssc.start()
ssc.awaitTermination()
I've run the SparkPi example using 'spark-submit' and it runs fine, so I can't figure out what's causing the problem in my application. Any help would be really appreciated.
Recommended answer
From the documentation of java.lang.AbstractMethodError:
Normally, this error is caught by the compiler; this error can only occur at run time if the definition of some class has incompatibly changed since the currently executing method was last compiled.
This means that there's a version incompatibility between your compile-time and runtime dependencies. Make sure you align those versions to resolve the issue.
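As an illustration, a common way this happens is compiling the application against one Spark version while the cluster's spark-submit runs another. A minimal sketch of an sbt build that keeps all Spark artifacts on the same version (the version numbers below are assumptions; match them to whatever your cluster actually runs):

```scala
// build.sbt -- illustrative sketch, not the asker's actual build file.
// The key point: every org.apache.spark artifact uses the SAME version,
// and that version matches the Spark installation running spark-submit.

scalaVersion := "2.10.4"  // assumption: Spark 1.2.x was built for Scala 2.10

val sparkVersion = "1.2.0"  // assumption: set this to your cluster's version

libraryDependencies ++= Seq(
  // "provided": the cluster supplies spark-core at runtime, so the
  // compile-time version must match the runtime version exactly.
  "org.apache.spark" %% "spark-core"            % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming"       % sparkVersion % "provided",
  // The Kafka connector gets bundled into your assembly jar; keeping it
  // on the same sparkVersion avoids the AbstractMethodError above.
  "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion
)
```

Defining a single `sparkVersion` value and reusing it for every Spark dependency makes it hard for the versions to drift apart when you later upgrade.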