Pyspark error - Unsupported class file major version 55

Problem description

FIX:

To fix this issue I edited my bash_profile to ensure Java 1.8 is used as the global default, as follows:

touch ~/.bash_profile; open ~/.bash_profile

Adding

export JAVA_HOME=$(/usr/libexec/java_home -v 1.8) 

and saving within TextEdit.
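
To verify the change took effect, reload the profile and check which Java is active (this assumes a 1.8 JDK is actually installed; /usr/libexec/java_home -v 1.8 fails with an error otherwise):

source ~/.bash_profile
echo $JAVA_HOME
java -version
# expected output starts with: java version "1.8.0_..."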

UPDATE

Due to license changes from Oracle, the above fix might not work, and you may encounter issues installing via brew. In order to install Java 8 you may need to follow this guide.

Question:

I'm trying to install Spark on my Mac. I've used Homebrew to install Spark 2.4.0 and Scala. I've installed PySpark in my Anaconda environment and am using PyCharm for development. I've exported to my bash profile:

export SPARK_VERSION=`ls /usr/local/Cellar/apache-spark/ | sort | tail -1`
export SPARK_HOME="/usr/local/Cellar/apache-spark/$SPARK_VERSION/libexec"
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH

However, I'm unable to get it to work.

From reading the traceback, I suspect this is due to the Java version. I would really appreciate some help fixing the issue. Please comment if there is any information I could provide that would be helpful beyond the traceback.

I get the following error:

Traceback (most recent call last):
  File "<input>", line 4, in <module>
  File "/anaconda3/envs/coda/lib/python3.6/site-packages/pyspark/rdd.py", line 816, in collect
    sock_info = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
  File "/anaconda3/envs/coda/lib/python3.6/site-packages/py4j/java_gateway.py", line 1257, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/anaconda3/envs/coda/lib/python3.6/site-packages/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: java.lang.IllegalArgumentException: Unsupported class file major version 55

Recommended answer

Until Spark supports Java 11 (which will hopefully be mentioned in the latest documentation when it does), you have to add a flag to set your Java version to Java 8.
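
As background, the class file major version in the error maps one-to-one to a Java release: 52 is Java 8, 53 is Java 9, 54 is Java 10, and 55 is Java 11. The error therefore means Spark 2.4.x was handed Java 11 bytecode it cannot parse. A quick way to check which version a given JDK produces (javap ships with every JDK):

javap -verbose java.lang.String | grep "major"
# prints "major version: 55" on Java 11, "major version: 52" on Java 8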

As of Spark 2.4.x

Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.4 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).

On a Mac, I am able to do this in my .bashrc,

export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)

You can also set this in spark-env.sh rather than setting the variable for your whole profile.
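
A minimal sketch of that approach (assuming $SPARK_HOME is set as in the question, and creating spark-env.sh from the bundled template if it does not exist yet):

cp $SPARK_HOME/conf/spark-env.sh.template $SPARK_HOME/conf/spark-env.sh
# single quotes keep the command substitution unevaluated until Spark sources the file
echo 'export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)' >> $SPARK_HOME/conf/spark-env.sh

This keeps Java 8 scoped to Spark launches while the rest of your shell keeps its default JDK.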

And you'll need to install Java 8 in addition to your existing Java 11.
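
One route that worked after the Oracle license change was an OpenJDK 8 build from the AdoptOpenJDK Homebrew tap; treat this as a sketch, since the tap and cask names have shifted over time (the project later became Eclipse Temurin):

brew tap adoptopenjdk/openjdk
brew install --cask adoptopenjdk8
# list every installed JDK, then select 1.8 as shown above
/usr/libexec/java_home -V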
