SparkException using JavaStreamingContext.getOrCreate(): Only one SparkContext may be running in this JVM


Question

Related to this question, I got the tip that the getOrCreate idiom should be used to avoid this issue. But trying:

JavaStreamingContextFactory contextFactory = new JavaStreamingContextFactory() {

    @Override
    public JavaStreamingContext create() {
        // Called when no valid checkpoint data exists: builds a new StreamingContext
        // (and therefore a new SparkContext) from scratch.
        final SparkConf conf = new SparkConf().setAppName(NAME);
        return new JavaStreamingContext(conf, Durations.seconds(BATCH_SPAN));
    }

};

final JavaStreamingContext context = JavaStreamingContext.getOrCreate("/tmp/"+NAME, contextFactory);

I still get:

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
org.example.ExamplePipeline$1.create(ExamplePipeline.java:56)
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:706)
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:705)
scala.Option.getOrElse(Option.scala:120)
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:864)
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:705)
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
org.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:70)
org.example.ExamplePipeline.exec(ExamplePipeline.java:116)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1702)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1641)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1570)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2257)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2239)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2239)
    at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2325)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:2197)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
    at org.example.ExamplePipeline$1.create(ExamplePipeline.java:56)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:706)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:705)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:864)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:705)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
    at org.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:70)
    at org.example.ExamplePipeline.exec(ExamplePipeline.java:116)
    at org.example.ExamplePipeline.main(ExamplePipeline.java:157)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:786)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:123)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

What am I doing wrong?

Thanks.

Answer

According to this question, I think this is how I should do it:

// Reuse the SparkContext that is already active in this JVM (or create one if none exists)
SparkConf conf = new SparkConf().setAppName(NAME);
JavaSparkContext ctx = JavaSparkContext.fromSparkContext(SparkContext.getOrCreate(conf));
JavaStreamingContext context = new JavaStreamingContext(ctx, Durations.seconds(BATCH_SPAN));

Is this correct?
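For reference, here is a minimal sketch of my own (not from the linked question; the class and method names StreamingContextProvider and getOrCreateStreamingContext are hypothetical, and it assumes Spark 1.x) that wraps this idea up: it builds the streaming context on top of whatever SparkContext is already active in the JVM and enables checkpointing on it. Note that, unlike JavaStreamingContext.getOrCreate(), this does not restore a previous streaming computation from the checkpoint directory after a driver restart; it only writes checkpoints going forward.

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public final class StreamingContextProvider {

    // Builds a JavaStreamingContext on top of the SparkContext that is already
    // active in this JVM (creating one only if none exists), so a second
    // SparkContext is never constructed.
    public static JavaStreamingContext getOrCreateStreamingContext(String appName,
                                                                   long batchSeconds,
                                                                   String checkpointDir) {
        SparkConf conf = new SparkConf().setAppName(appName);
        JavaSparkContext sc = JavaSparkContext.fromSparkContext(SparkContext.getOrCreate(conf));
        JavaStreamingContext context = new JavaStreamingContext(sc, Durations.seconds(batchSeconds));
        // Checkpointing can still be enabled directly on the streaming context.
        context.checkpoint(checkpointDir);
        return context;
    }
}

// Usage (NAME and BATCH_SPAN are the same placeholders as in the question):
// JavaStreamingContext context =
//         StreamingContextProvider.getOrCreateStreamingContext(NAME, BATCH_SPAN, "/tmp/" + NAME);

This keeps exactly one SparkContext per JVM, which is what the exception complains about; the trade-off is that driver-failure recovery from an existing checkpoint (the original purpose of the getOrCreate idiom) is not covered by this sketch.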
