Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps
Problem description
When I run the following in a terminal:
sudo spark-submit --master local --class xxx.xxxx.xxx.xxxx.xxxxxxxxxxxxJob --conf 'spark.driver.extraJavaOptions=-Dconfig.resource=xxx.conf' /home/xxxxx/workspace/prueba/pruebas/target/scala-2.11/MiPrueba.jar
I get the following error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at pureconfig.DurationUtils$.words(DurationUtils.scala:36)
at pureconfig.DurationUtils$.pureconfig$DurationUtils$$expandLabels(DurationUtils.scala:38)
at pureconfig.DurationUtils$$anonfun$2.apply(DurationUtils.scala:53)
at pureconfig.DurationUtils$$anonfun$2.apply(DurationUtils.scala:53)
at scala.collection.immutable.List.flatMap(List.scala:338)
at pureconfig.DurationUtils$.<init>(DurationUtils.scala:53)
at pureconfig.DurationUtils$.<clinit>(DurationUtils.scala)
at pureconfig.DurationReaders$class.$init$(BasicReaders.scala:114)
at pureconfig.ConfigReader$.<init>(ConfigReader.scala:121)
at pureconfig.ConfigReader$.<clinit>(ConfigReader.scala)
at xxx.xxxx.xxx.xxxx.config.package$Config$.load(package.scala:67)
at xxx.xxxx.xxx.xxxx.job.xxxxJob$class.main(XXXxxx.scala:23)
at xxx.xxxx.xxx.xxxx......Job$.main(Xxxxxxxxxxxx.scala:19)
at xxx.xxxx.xxx.xxxx..main(XXXXXXxxxxxxxx.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Build definitions:
version := "0.1"
scalaVersion := "2.11.11"
libraryDependencies:
val dependFullList = spark ++ hadoop ++ apisDownload ++ configuration
Configuration:
val configuration = Seq(
"com.github.pureconfig" %% "pureconfig" % "0.9.2",
"com.typesafe" % "config" % "1.3.1",
"org.lz4" % "lz4-java" % "1.4.1"
)
Spark:
val spark = Seq(
"org.apache.spark" %% "spark-core" % Versions.spark % "provided" exclude("javax.jms", "jms"),
"org.apache.spark" %% "spark-sql" % Versions.spark % "provided",
"com.databricks" %% "spark-xml" % "0.4.1"
// https://mvnrepository.com/artifact/mrpowers/spark-daria
)
Any ideas?
Answer
You're mixing Scala versions. Spark 2.4.2 doesn't support Scala 2.11. Switch to Spark 2.4.0 or replace your libraries with Scala 2.12 versions.
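One quick way to confirm a mismatch like this is to print the Scala version that is actually on the driver's classpath at runtime, for example as the first line of the job's `main`. `scala.util.Properties` is part of the Scala standard library; the object name below is illustrative:

```scala
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // Prints the Scala runtime version on the classpath, e.g. "version 2.12.x".
    // If this differs from the binary version the jar was compiled against
    // (here 2.11), NoSuchMethodError on scala.Predef methods is expected.
    println(scala.util.Properties.versionString)
  }
}
```

If this prints a 2.12 version while the jar was built for 2.11, the cluster's Spark distribution, not the build, is the source of the conflict.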
https://spark.apache.org/releases/spark-release-2-4-2.html
Note that Scala 2.11 support is deprecated from 2.4.1 onwards. As of 2.4.2, the pre-built convenience binaries are compiled for Scala 2.12. Spark is still cross-published for 2.11 and 2.12 in Maven Central, and can be built for 2.11 from source.
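A minimal build.sbt sketch of the first option (staying on Scala 2.11 and pinning Spark to 2.4.0, whose pre-built binaries use Scala 2.11 per the release notes above). The version numbers here are assumptions for illustration, not taken from the question's full build:

```scala
// build.sbt -- keep every dependency on the same Scala binary version.
// The %% operator appends the Scala binary suffix (_2.11) automatically,
// so scalaVersion determines which artifacts sbt resolves.
scalaVersion := "2.11.11"

val sparkVersion = "2.4.0" // a release whose convenience binaries ship with Scala 2.11

libraryDependencies ++= Seq(
  "org.apache.spark"      %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark"      %% "spark-sql"  % sparkVersion % "provided",
  "com.github.pureconfig" %% "pureconfig" % "0.9.2"
)
```

Because the Spark dependencies are marked `provided`, the Spark distribution that `spark-submit` runs against must itself be a 2.4.0 (Scala 2.11) install, or the same error will reappear at runtime.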