Spark Error - Unsupported class file major version

Problem Description

I'm trying to install Spark on my Mac. I've used home-brew to install spark 2.4.0 and Scala. I've installed PySpark in my anaconda environment and am using PyCharm for development. I've exported to my bash profile:

export SPARK_VERSION=`ls /usr/local/Cellar/apache-spark/ | sort | tail -1`
export SPARK_HOME="/usr/local/Cellar/apache-spark/$SPARK_VERSION/libexec"
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH
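
(As an aside, the py4j version in that last PYTHONPATH entry has to match the zip that actually ships with the install; it can be checked with something like the following, assuming the Homebrew layout above.)

# list the py4j source zip bundled with this Spark install (the filename varies by Spark version)
ls "$SPARK_HOME/python/lib/"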

But I can't get it to work.

I suspect this is due to the Java version, based on the traceback. I would really appreciate some help fixing the issue. Please comment if there is any information I could provide that would be helpful beyond the traceback.

I get the following error:

Traceback (most recent call last):
  File "<input>", line 4, in <module>
  File "/anaconda3/envs/coda/lib/python3.6/site-packages/pyspark/rdd.py", line 816, in collect
    sock_info = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
  File "/anaconda3/envs/coda/lib/python3.6/site-packages/py4j/java_gateway.py", line 1257, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/anaconda3/envs/coda/lib/python3.6/site-packages/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: java.lang.IllegalArgumentException: Unsupported class file major version 55

Recommended Answer

Edit: Spark 3.0 supports Java 11, so you'll need to upgrade:

Spark runs on Java 8/11, Scala 2.12, Python 2.7+/3.4+ and R 3.1+. Java 8 prior to version 8u92 support is deprecated as of Spark 3.0.0
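
If upgrading is an option, a hedged sketch for the Anaconda/Homebrew setup described in the question (verify package names against your environment):

# upgrade the pip-installed PySpark inside the anaconda env
pip install --upgrade "pyspark>=3.0"
# and/or upgrade the Homebrew-managed Spark distribution
brew upgrade apache-spark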



Original Answer

Class file major version 55 corresponds to Java 11, which Spark 2.4 does not support. Until Spark supports Java 11 or higher (which will hopefully be mentioned in the latest documentation when it does), you have to add a flag to set your Java version to Java 8.

As of Spark 2.4.x

Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.4 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x)

On Mac/Unix, see asdf-java for installing different Java versions.
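
A rough sketch of that route, assuming the asdf java plugin from the plugin registry (the exact Java 8 build string has to be taken from the list command, so the placeholders below are illustrative only):

# add the java plugin and browse the available builds
asdf plugin add java
asdf list all java | grep 8.0
# install and activate one of the Java 8 builds from that list (placeholder name)
asdf install java <some-8.x-build-from-the-list>
asdf global java <some-8.x-build-from-the-list>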

On a Mac, I am able to do this in my .bashrc,

export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
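
To confirm the flag took effect (assuming the export above lives in ~/.bashrc):

# reload the profile and check which JVM is now picked up
source ~/.bashrc
java -version    # should report a 1.8.x runtime
echo $JAVA_HOME  # should point at a JDK 8 home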

On Windows, check out Chocolatey, but seriously, just use WSL2 or Docker to run Spark.

You can also set this in spark-env.sh rather than setting the variable for your whole profile.
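
A minimal sketch of that alternative, assuming the Homebrew layout from the question (Spark's launch scripts source $SPARK_HOME/conf/spark-env.sh if it exists):

# $SPARK_HOME/conf/spark-env.sh
# point only Spark at Java 8, leaving the rest of the shell profile untouched
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)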

And, of course, this all means you'll need to install Java 8 in addition to your existing Java 11.
