Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)
Problem description
I am getting an error when I try to run a Spark application with Cassandra:
Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243).
I am using Spark version 1.2.0, and I am clearly using only one SparkContext in my application. But whenever I add the following code for streaming, I get this error:
JavaStreamingContext activitySummaryScheduler = new JavaStreamingContext(
        sparkConf, new Duration(1000));
Recommended answer
You can only have one SparkContext at a time, and since a StreamingContext contains a SparkContext, you can't have a separate streaming and Spark context in the same application. What you can do is build a StreamingContext from your existing SparkContext, so you have access to both if you really need that.
Use this constructor:

JavaStreamingContext(sparkContext: JavaSparkContext, batchDuration: Duration)
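Putting it together, a minimal sketch of the fix might look like the following. The class name, app name, and `local[2]` master are assumptions for illustration; the point is that the `JavaStreamingContext` is built from the already-created `JavaSparkContext` rather than from the `SparkConf`, so only one SparkContext ever exists in the JVM.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class ActivitySummaryApp {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf()
                .setAppName("ActivitySummary")   // hypothetical app name
                .setMaster("local[2]");          // local mode for illustration

        // The one and only SparkContext in this JVM.
        JavaSparkContext sc = new JavaSparkContext(sparkConf);

        // Build the StreamingContext from the existing SparkContext,
        // not from the SparkConf, so no second SparkContext is created.
        JavaStreamingContext activitySummaryScheduler =
                new JavaStreamingContext(sc, new Duration(1000));

        // ... define your DStreams here, then:
        // activitySummaryScheduler.start();
        // activitySummaryScheduler.awaitTermination();
    }
}
```

Constructing `new JavaStreamingContext(sparkConf, new Duration(1000))` after a `JavaSparkContext` already exists is exactly what triggers the SPARK-2243 error, because that constructor tries to create a second SparkContext internally.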