Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)
Problem description
Any reason why I get this error ? Initially the IDE plugin for Scala was 2.12.3. But since I'm working with Spark 2.2.0, I manually changed it to Scala 2.11.11.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/09/19 12:08:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
at scala.xml.Null$.<init>(Null.scala:23)
at scala.xml.Null$.<clinit>(Null.scala)
at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:67)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:84)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:221)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:163)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
at sparkEnvironment$.<init>(Ticket.scala:33)
at sparkEnvironment$.<clinit>(Ticket.scala)
at Ticket$.main(Ticket.scala:39)
at Ticket.main(Ticket.scala)
Answer
You can't use the Scala 2.12 series with any version of Spark.
You can try using the 2.11 series of Scala with Spark, but make sure Spark is compatible with the corresponding Scala version, i.e.
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

As you can see, in this dependency spark-core_2.11 is associated with Scala version 2.11.
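If you are unsure which Scala library actually ends up on your runtime classpath, a quick check is to print the version reported by the standard library itself. This is a minimal sketch (the object name `ScalaVersionCheck` is an arbitrary choice); `scala.util.Properties.versionNumberString` is part of the standard library:

```scala
// Prints the version of the scala-library jar on the runtime classpath.
// If this prints a 2.12.x version while Spark 2.2.0 is on the classpath,
// the NoSuchMethodError above is expected.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(s"Scala library version: ${scala.util.Properties.versionNumberString}")
  }
}
```

For Spark 2.2.0 this should print a 2.11.x version; anything in the 2.12 series means the wrong scala-library was resolved.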
Or you can use this dependency:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

It will automatically infer the Scala version.
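Putting both pieces together, a minimal build.sbt along these lines should avoid the mismatch (a sketch, not a complete build file; the exact patch versions are examples):

```scala
// build.sbt — minimal sketch
// Pin the Scala version to the 2.11 series that Spark 2.2.0 is built against.
scalaVersion := "2.11.11"

// %% appends the Scala binary suffix (_2.11) to the artifact name automatically,
// so this resolves to spark-core_2.11 and stays in sync with scalaVersion above.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
```

The advantage of `%%` over hard-coding `spark-core_2.11` with `%` is that if you later change `scalaVersion`, sbt picks the matching artifact (or fails to resolve) instead of silently mixing binary-incompatible Scala versions.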
Hope this is clear. Thanks.