Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)
Problem Description
I am getting an error when I try to run a Spark application with Cassandra:
Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243).
I am using Spark version 1.2.0, and it is clear that I am only using one SparkContext in my application. But whenever I try to add the following code for streaming purposes, I get this error:
JavaStreamingContext activitySummaryScheduler = new JavaStreamingContext(
sparkConf, new Duration(1000));
Recommended Answer
You can only have one SparkContext at a time, and since a StreamingContext contains a SparkContext, you can't have a separate streaming context and Spark context in the same code. What you can do is build a StreamingContext on top of your SparkContext so you can have access to both if you really need that.
Use this constructor: JavaStreamingContext(sparkContext: JavaSparkContext, batchDuration: Duration)
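A minimal sketch of this approach, assuming a Spark 1.2-era setup (the class name, app name, and master URL are illustrative, not from the question): create the single JavaSparkContext first, then pass it to the JavaStreamingContext constructor instead of passing the SparkConf, so only one SparkContext ever exists in the JVM.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SingleContextExample {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf()
                .setAppName("ActivitySummary")   // illustrative name
                .setMaster("local[2]");          // illustrative master

        // The one and only SparkContext in this JVM
        JavaSparkContext sc = new JavaSparkContext(sparkConf);

        // Build the streaming context from the existing SparkContext,
        // NOT from sparkConf (which would try to create a second one)
        JavaStreamingContext activitySummaryScheduler =
                new JavaStreamingContext(sc, new Duration(1000));

        // Both contexts are now usable; activitySummaryScheduler.sparkContext()
        // returns the same underlying SparkContext as sc.
    }
}
```

With this pattern, batch jobs can keep using `sc` while the streaming jobs run on `activitySummaryScheduler`, and the "Only one SparkContext" exception no longer occurs.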