Py4JError: SparkConf does not exist in the JVM


Question

I am running pyspark, but it can be unstable at times. It has crashed several times at this command:

spark_conf = SparkConf()

with the following error message:

     File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/pyspark/conf.py", line 106, in __init__
self._jconf = _jvm.SparkConf(loadDefaults)
     File "/home/user1/spark/spark-1.5.1-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 772, in __getattr__
raise Py4JError('{0} does not exist in the JVM'.format(name))
     Py4JError: SparkConf does not exist in the JVM

Any idea what the problem is? Thank you for your help!

Answer

SparkConf does not exist in the pyspark context by default; import it explicitly:

from pyspark import SparkConf

in the pyspark console or in your code.
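For example, a minimal sketch of a working setup after the import (the app name "MyApp" and the master URL "local[*]" are placeholder assumptions, not values from the question):

from pyspark import SparkConf, SparkContext

# Build the configuration object; this is the line that previously
# raised Py4JError when SparkConf was not imported.
spark_conf = SparkConf().setAppName("MyApp").setMaster("local[*]")

# Pass the configuration to the context and verify it works.
sc = SparkContext(conf=spark_conf)
print(sc.version)
sc.stop()

Note that in the interactive pyspark shell a SparkContext named sc is already created for you; the explicit import is mainly needed in standalone scripts.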

