Spark Error - Unsupported class file major version


Problem description

I'm trying to install Spark on my Mac. I've used Homebrew to install Spark 2.4.0 and Scala. I've installed PySpark in my Anaconda environment and am using PyCharm for development. I've exported the following in my bash profile:

export SPARK_VERSION=`ls /usr/local/Cellar/apache-spark/ | sort | tail -1`
export SPARK_HOME="/usr/local/Cellar/apache-spark/$SPARK_VERSION/libexec"
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH

However, I'm unable to get it to work.

From reading the traceback, I suspect this is due to the Java version. I would really appreciate some help fixing the issue. Please comment if there is any information I could provide beyond the traceback that would be helpful.

I am getting the following error:

Traceback (most recent call last):
  File "<input>", line 4, in <module>
  File "/anaconda3/envs/coda/lib/python3.6/site-packages/pyspark/rdd.py", line 816, in collect
    sock_info = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
  File "/anaconda3/envs/coda/lib/python3.6/site-packages/py4j/java_gateway.py", line 1257, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/anaconda3/envs/coda/lib/python3.6/site-packages/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: java.lang.IllegalArgumentException: Unsupported class file major version 55

Solution

Edit: Spark 3.0 supports Java 11, so you'll need to upgrade.

Spark runs on Java 8/11, Scala 2.12, Python 2.7+/3.4+ and R 3.1+. Java 8 prior to version 8u92 support is deprecated as of Spark 3.0.0.
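
(Class file major version 55 is Java 11 bytecode; Spark 2.4 cannot read it, while Spark 3.0+ can.) If you take the upgrade route, a minimal sketch for a Homebrew install like the one in the question might look like the following; the apache-spark formula name matches the paths above, and the exact version you end up with depends on your Homebrew state:

brew update
brew upgrade apache-spark            # should move /usr/local/Cellar/apache-spark/ to a 3.x version
ls /usr/local/Cellar/apache-spark/   # the SPARK_VERSION export above already picks the newest directory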



Original answer

Until Spark supports Java 11 or higher (which will hopefully be mentioned in the latest documentation when it does), you have to add a flag to set your Java version to Java 8.

As of Spark 2.4.x:

Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.4 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).

On Mac/Unix, see asdf-java for installing different Javas.
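
A minimal sketch of the asdf-java route, assuming asdf itself is already installed; the plugin URL and the version identifier below are illustrative, so pick an actual Java 8 build from the list that asdf prints:

asdf plugin add java https://github.com/halcyon/asdf-java.git   # register the Java plugin (illustrative URL)
asdf list all java | grep 8.0                                   # list the available Java 8 builds
asdf install java adoptopenjdk-8.0.292+10                       # illustrative version string, use one from the list
asdf global java adoptopenjdk-8.0.292+10                        # make it the default JDK for your shells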

On a Mac, I am able to do this in my .bashrc:

export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
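
A quick check that the export really resolves to a 1.8 JDK (assuming one is installed; otherwise java_home reports that no matching JVM was found):

/usr/libexec/java_home -v 1.8        # prints the home of an installed 1.8 JDK
echo $JAVA_HOME                      # after re-sourcing .bashrc, should show that same path
"$JAVA_HOME/bin/java" -version       # should report java version "1.8.0_..."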

On Windows, check out Chocolatey, but seriously just use WSL2 or Docker to run Spark.


You can also set this in spark-env.sh rather than setting the variable for your whole profile.
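
A minimal sketch of that alternative, assuming $SPARK_HOME points at the libexec directory exported above; conf/spark-env.sh does not exist by default, so copy it from the shipped template and append the same JAVA_HOME line:

cp "$SPARK_HOME/conf/spark-env.sh.template" "$SPARK_HOME/conf/spark-env.sh"
echo 'export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)' >> "$SPARK_HOME/conf/spark-env.sh"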

And, of course, this all means you'll need to install Java 8 in addition to your existing Java 11.
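
On macOS you can confirm that both JDKs coexist before pointing Spark at the Java 8 one; a quick check:

/usr/libexec/java_home -V            # capital V lists every installed JDK, e.g. an 11.x and a 1.8.x entry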

