local class incompatible Exception: when running Spark standalone from the IDE
Problem description
I have started testing Spark. I installed Spark on my local machine and am running a local cluster with a single worker. When I try to execute my job from my IDE with the following SparkConf:
final SparkConf conf = new SparkConf().setAppName("testSparkfromJava").setMaster("spark://XXXXXXXXXX:7077");
final JavaSparkContext sc = new JavaSparkContext(conf);
final JavaRDD<String> distFile = sc.textFile(Paths.get("").toAbsolutePath().toString() + "/dataSpark/datastores.json");
I got this exception:
java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -5447855329526097695, local class serialVersionUID = -2221986757032131007
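The exception comes from plain Java serialization: the `RequestMessage` class shipped in the driver's spark-core jar has a different serialVersionUID than the one compiled into the cluster's Spark build. A minimal, self-contained sketch (plain Java, no Spark; all names here are illustrative) reproduces the same `InvalidClassException` by deserializing a stream whose recorded serialVersionUID no longer matches the local class:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InvalidClassException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.charset.StandardCharsets;

public class SerialUidDemo {

    static class Msg implements Serializable {
        private static final long serialVersionUID = 1L;
        String payload = "hello";
    }

    /** Serializes a Msg, corrupts the serialVersionUID recorded in the
     *  stream, and returns the resulting InvalidClassException message. */
    static String corruptAndDeserialize() throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Msg());
        }
        byte[] bytes = bos.toByteArray();

        // In the serialization stream, the 8-byte serialVersionUID sits
        // right after the UTF-encoded class name in the class descriptor;
        // flip one bit of it to simulate a mismatched class version.
        byte[] name = Msg.class.getName().getBytes(StandardCharsets.UTF_8);
        int uidPos = indexOf(bytes, name) + name.length;
        bytes[uidPos + 7] ^= 0x01;

        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            ois.readObject();
            return "no exception";
        } catch (InvalidClassException e) {
            // Message reads "...; local class incompatible: stream classdesc
            // serialVersionUID = ..., local class serialVersionUID = ..."
            return e.getMessage();
        }
    }

    /** Returns the index of the first occurrence of needle in haystack. */
    static int indexOf(byte[] haystack, byte[] needle) {
        outer:
        for (int i = 0; i <= haystack.length - needle.length; i++) {
            for (int j = 0; j < needle.length; j++) {
                if (haystack[i + j] != needle[j]) continue outer;
            }
            return i;
        }
        return -1;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(corruptAndDeserialize());
    }
}
```

In the Spark case nothing is corrupted on the wire; the driver and the cluster simply hold two different compiled versions of the same class, which is why aligning versions (below) fixes it.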
Answer
Got it all working with the following combination of versions.
Installed Spark 1.6.2
Verified with bin/spark-submit --version
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
and Scala 2.10.6 with Java 8.
Note that it did NOT work, failing with a similar class-incompatible issue, with the versions below:
Scala 2.11.8 and Java 8
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.6.2</version>
</dependency>
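One way to keep the driver and the installed cluster from drifting apart again (a sketch, not part of the original answer) is to centralize the Scala binary suffix and the Spark version in Maven properties, so both are changed in one place:

```xml
<properties>
    <scala.binary.version>2.10</scala.binary.version>
    <spark.version>1.6.2</spark.version>
</properties>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
</dependency>
```

Whatever values you pick here must match the Spark build the standalone master is actually running (as reported by bin/spark-submit --version), since the driver and the cluster exchange serialized classes from those jars.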