Apache Spark: java.lang.NoSuchMethodError .rddToPairRDDFunctions
Question
sbt package runs just fine, but after spark-submit I get the error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.rddToPairRDDFunctions(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;
    at SmokeStack$.main(SmokeStack.scala:46)
    at SmokeStack.main(SmokeStack.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Here is the offending line:
val sigCounts = rowData.map(row => (row("Signature"), 1)).countByKey()
rowData is an RDD of Map[String, String]. The "Signature" key exists in every map.
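The intent of the offending line can be sketched with plain Scala collections (no Spark needed). The sample data below is hypothetical, but the counting logic mirrors what countByKey does on an RDD of (key, value) pairs:

```scala
// A minimal sketch of what the offending line computes, using plain
// Scala collections in place of an RDD (sample rows are hypothetical).
object CountByKeySketch {
  val rowData: Seq[Map[String, String]] = Seq(
    Map("Signature" -> "A", "Host" -> "h1"),
    Map("Signature" -> "B", "Host" -> "h2"),
    Map("Signature" -> "A", "Host" -> "h3")
  )

  // Equivalent of rowData.map(row => (row("Signature"), 1)).countByKey():
  // group the (key, 1) pairs by key and count how many fall in each group.
  def sigCounts: Map[String, Long] =
    rowData
      .map(row => (row("Signature"), 1))
      .groupBy { case (sig, _) => sig }
      .map { case (sig, pairs) => (sig, pairs.size.toLong) }
}
```

On an RDD this same expression compiles only because an implicit conversion (rddToPairRDDFunctions) wraps the RDD of pairs, which is why a compile-time/runtime Spark mismatch surfaces exactly at this line.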
I suspect this may be a build issue. Below is my sbt file:
name := "Example1"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
scalacOptions ++= Seq("-feature")
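For reference, the usual shape of the fix (per the answer below) is to pin spark-core to the version actually deployed where spark-submit runs, and mark it "provided" so the cluster's own Spark jars are used at runtime. A sketch, where "2.0.1" is a placeholder for whatever version spark-submit --version reports on your machine or cluster:

```scala
name := "Example1"

version := "0.1"

// Must match the Scala version your Spark distribution was built with
scalaVersion := "2.11.8"

// Pin spark-core to the deployed Spark version (placeholder here),
// and mark it "provided" so the runtime uses the cluster's jars
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1" % "provided"

scalacOptions ++= Seq("-feature")
```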
I'm new to Scala so maybe the imports are not correct? I have:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import scala.io.Source
Answer
java.lang.NoSuchMethodError is often an indication that the code was compiled against a higher version than the libraries available at runtime. With Spark, that means the Spark version used to compile is different from the one deployed (on the machine or cluster).
Aligning the versions between development and runtime should solve this issue.