py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM


Problem Description


I am currently on JRE: 1.8.0_181, Python: 3.6.4, spark: 2.3.2


I am trying to execute the following code in Python:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('Basics').getOrCreate()

It fails with the following error:


spark = SparkSession.builder.appName('Basics').getOrCreate()
Traceback (most recent call last):
  File "", line 1, in
  File "C:\Tools\Anaconda3\lib\site-packages\pyspark\sql\session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "C:\Tools\Anaconda3\lib\site-packages\pyspark\context.py", line 349, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "C:\Tools\Anaconda3\lib\site-packages\pyspark\context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "C:\Tools\Anaconda3\lib\site-packages\pyspark\context.py", line 195, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc)
  File "C:\Tools\Anaconda3\lib\site-packages\py4j\java_gateway.py", line 1487, in __getattr__
    "{0}.{1} does not exist in the JVM".format(self._fqn, name))
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM


Does anyone have any idea what the potential issue could be here?


Appreciate any help or feedback here. Thank you!

Recommended Answer


As outlined in "pyspark error does not exist in the jvm error when initializing SparkContext", adding the PYTHONPATH environment variable with the value:


%SPARK_HOME%\python;%SPARK_HOME%\python\lib\py4j-<version>-src.zip;%PYTHONPATH%
(just check which py4j version you have in your spark/python/lib folder) helped resolve this issue.
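
For reference, the snippet below is a minimal sketch of applying the same idea from within Python instead of through Windows environment-variable settings: it puts Spark's own python directory and the bundled py4j zip at the front of sys.path before importing pyspark, so the PySpark that Python loads matches the Spark 2.3.2 JVM it talks to. The SPARK_HOME path shown is a hypothetical example and must be replaced with your actual Spark installation directory; the py4j zip is located by pattern so its exact version does not need to be hard-coded.

import glob
import os
import sys

# Hypothetical install location -- replace with your actual Spark 2.3.2 directory.
os.environ.setdefault("SPARK_HOME", r"C:\Tools\spark-2.3.2-bin-hadoop2.7")

spark_home = os.environ["SPARK_HOME"]
spark_python = os.path.join(spark_home, "python")

# Pick up whatever py4j-*-src.zip ships with this Spark build,
# instead of hard-coding the py4j version.
py4j_zip = glob.glob(os.path.join(spark_python, "lib", "py4j-*-src.zip"))[0]

# Make sure the PySpark and py4j bundled with this Spark build are the ones
# Python imports, not a mismatched pip-installed copy.
for path in (py4j_zip, spark_python):
    if path not in sys.path:
        sys.path.insert(0, path)

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('Basics').getOrCreate()
print(spark.version)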
