NoSuchMethodError for Scala Seq line in Spark


Problem description


I am getting an error when trying to run plain Scala code in Spark, similar to the errors described in these posts: this and this.

Their problem was that they had compiled their Spark project with the wrong Scala version. However, mine is compiled with the correct version.

I have Spark 1.6.0 installed on an AWS EMR cluster to run the program. The project is compiled on my local machine, which has Scala 2.11 installed, and 2.11 is listed in all dependencies and build files, with no references to 2.10.
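
For reference, the build is roughly along these lines (a minimal sketch assuming sbt is the build tool; the project name and exact patch versions are illustrative placeholders, not the real ones):

// build.sbt (illustrative sketch)
name := "myproject"
scalaVersion := "2.11.8"
// %% appends the Scala binary version, so these resolve to the _2.11 artifacts
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.6.0" % "provided"
)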

This is the exact line that throws the error:

var fieldsSeq: Seq[StructField] = Seq()

And this is the exact error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
at com.myproject.MyJob$.main(MyJob.scala:39)
at com.myproject.MyJob.main(MyJob.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Solution

Spark 1.6 on EMR is still built with Scala 2.10, so yes, you are having the same issue as in the posts you linked. In order to use Spark on EMR, you currently must compile your application with Scala 2.10.
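
For context on the specific method in the stack trace: scala.runtime.ObjectRef.create was added in the Scala 2.11 runtime library; the 2.11 compiler emits calls to it for mutable local variables captured in closures, which is presumably what happens to fieldsSeq here. Bytecode compiled against 2.11 therefore throws NoSuchMethodError when it runs on the 2.10 runtime bundled with Spark 1.6 on EMR. Concretely, the fix is to switch the build to Scala 2.10, roughly like this (a sketch assuming sbt; adjust for whatever build tool you actually use):

// build.sbt (illustrative sketch) -- target Scala 2.10 to match Spark 1.6 on EMR
scalaVersion := "2.10.6"
// %% now resolves to the _2.10 Spark artifacts
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.6.0" % "provided"
)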

Spark has upgraded its default Scala version to 2.11 as of Spark 2.0 (to be released within the next several months), so once EMR supports Spark 2.0, we will likely follow this new default and compile Spark with Scala 2.11.
