ValueError: Cannot run multiple SparkContexts at once in spark with pyspark


Problem description

I am new to using Spark, and I am trying to run this code with pyspark:

from pyspark import SparkConf, SparkContext
import collections

conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
sc = SparkContext(conf = conf)

but it keeps giving me this error message:

Using Python version 3.5.2 (default, Jul  5 2016 11:41:13)
SparkSession available as 'spark'.
>>> from pyspark import SparkConf, SparkContext
>>> import collections
>>> conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
>>> sc = SparkContext(conf = conf)



   Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "C:\spark\python\pyspark\context.py", line 115, in __init__
        SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
      File "C:\spark\python\pyspark\context.py", line 275, in _ensure_initialized
        callsite.function, callsite.file, callsite.linenum))
    ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by getOrCreate at C:\spark\bin\..\python\pyspark\shell.py:43
    >>>

I have Spark 2.1.1 and Python 3.5.2. I searched and found that the problem is with sc: it cannot be created, but I don't know why. Can anyone help?

Recommended answer

You can try

sc = SparkContext.getOrCreate()
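
The error happens because the pyspark shell already creates a SparkContext at startup (that is the existing SparkContext(app=PySparkShell, ...) mentioned in the traceback, and the reason the banner says "SparkSession available as 'spark'"), so constructing a second one with SparkContext(conf=conf) fails. getOrCreate returns the existing context instead of raising. A minimal sketch of both ways to use it; the variant that passes conf is only relevant when running outside the shell, where no context exists yet:

from pyspark import SparkConf, SparkContext

# Inside the pyspark shell: reuse the context the shell already created.
sc = SparkContext.getOrCreate()

# In a standalone script (run with spark-submit or plain python):
# the configuration is applied only if no context exists yet.
conf = SparkConf().setMaster("local").setAppName("RatingsHistogram")
sc = SparkContext.getOrCreate(conf=conf)

print(sc.appName)  # in the shell this still prints "PySparkShell"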
