Scala code throws exception in Spark
Problem description
I am new to Scala and Spark. Today I tried to write some code and run it on Spark, but I got an exception.
This code works when run locally in Scala:
import org.apache.commons.lang.time.StopWatch
import org.apache.spark.{SparkConf, SparkContext}
import scala.collection.mutable.ListBuffer
import scala.util.Random

def test(): List[Int] = {
  val size = 100
  val range = 100
  var listBuffer = new ListBuffer[Int] // the exception is thrown here
  val random = new Random()
  for (i <- 1 to size)
    listBuffer += random.nextInt(range)
  listBuffer.foreach(x => println(x))
  listBuffer.toList
}
But when I put this code into Spark, it throws an exception that says:
15/01/01 14:06:17 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
at com.tudou.sortedspark.Sort$.test(Sort.scala:35)
at com.tudou.sortedspark.Sort$.sort(Sort.scala:23)
at com.tudou.sortedspark.Sort$.main(Sort.scala:14)
at com.tudou.sortedspark.Sort.main(Sort.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
If I comment out the line below, the code works in Spark:
for (i <- 1 to size)
Can someone explain why, please?
Recommended answer
Thanks @Imm, I have solved this issue. The root cause is that my local Scala is 2.11.4, but my Spark cluster is running Spark 1.2.0, which was compiled against Scala 2.10. The for loop's body is a closure that captures the mutable `listBuffer` variable; Scala 2.11 compiles that capture using `scala.runtime.ObjectRef.create`, a factory method that does not exist in the 2.10 runtime shipped with the cluster, hence the `NoSuchMethodError`.
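As a side note, a rewrite that avoids capturing a `var` in the loop body would sidestep the `ObjectRef` machinery entirely. This is a sketch of an alternative, not the fix the answer describes; the version mismatch is still the underlying problem:

```scala
import scala.util.Random

// Build the list functionally: List.fill closes over no mutable
// variable, so the compiler emits no ObjectRef-based var capture.
def test(): List[Int] = {
  val size = 100
  val range = 100
  val random = new Random()
  List.fill(size)(random.nextInt(range))
}
```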
So the solution is to compile the local code with Scala 2.10 and upload the compiled jar to Spark. Everything works fine.