Apache Spark - UDF doesn't seem to work with spark-submit


Problem Description

I am unable to get a UDF to work with spark-submit. I don't have any problem while using spark-shell.

Please see below the error message, sample code, build.sbt, and the command used to run the program.

Will appreciate all the help! - Regards, Venki

Exception in thread "main" java.lang.NoSuchMethodError:
scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
    at TryUDFApp$.main(TryUDFApp.scala:20)


Code:

/* TryUDFApp.scala */

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object TryUDFApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    // print "Hello World"
    println("Hello World -- I am trying to use UDF!")
    // Create a UDF
    val tryUDF = udf { (arg1: String, arg2: String) => arg2 + arg1 }
  }
}
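
For reference, the UDF above is defined but never applied to any data. Below is a minimal sketch of how it would typically be used on a DataFrame under the Spark 1.6 API; the SQLContext, sample rows, and column names are illustrative additions, not part of the original program.

// Illustrative usage only (would go inside main, after tryUDF is defined)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

// Build a tiny DataFrame and apply the UDF to its two string columns
val df = sc.parallelize(Seq(("foo", "bar"))).toDF("arg1", "arg2")
df.withColumn("swapped", tryUDF($"arg1", $"arg2")).show()
// tryUDF computes arg2 + arg1, so ("foo", "bar") yields "barfoo"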


build.sbt

name := "TryUDFApp Project"
version := "1.0"
scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.1",
    "org.apache.spark" %% "spark-sql"  % "1.6.1"
)


Command to run the code:

$SPARK_HOME/bin/spark-submit --class "TryUDFApp" --master local[4] $TADIR/target/scala-2.11/tryudfapp-project_2.11-1.0.jar

echo $SPARK_HOME

/Users/venki/Spark/spark-1.6.1-bin-hadoop2.6

Answer

When you see a NoSuchMethodError or ClassNotFoundException regarding a Scala library (in this case, scala.reflect.api.JavaUniverse.runtimeMirror), this usually means a Scala version mismatch happened somewhere.

You're using Spark 1.6.1, which comes pre-built for Scala 2.10, but your project is Scala 2.11.7, hence the error.
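
A quick way to confirm which Scala version a Spark distribution was built against (an optional sanity check, not part of the original answer) is to start its spark-shell and query the runtime; the startup banner also prints the Scala version.

// Inside $SPARK_HOME/bin/spark-shell of the pre-built Spark 1.6.1:
scala> util.Properties.versionString
res0: String = version 2.10.5
// A 2.10.x result here confirms the build targets Scala 2.10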

Your options are:

  1. Downgrade your project to Scala 2.10 (see the build.sbt sketch below)
  2. Build Spark 1.6.1 with Scala 2.11 support (from source)
  3. Use Spark 2.0, which comes pre-built with Scala 2.11 support
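
For option 1, the essential change is a single line in build.sbt so the project compiles against the same Scala 2.10 line that the pre-built Spark 1.6.1 ships with. A sketch, assuming the 2.10.6 patch release (any recent 2.10.x should work); marking the Spark artifacts as "provided" is a common companion change for spark-submit, but not required for the fix itself.

name := "TryUDFApp Project"
version := "1.0"
scalaVersion := "2.10.6"    // must match the Scala major version of the Spark build

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",  // provided: supplied by spark-submit at runtime
    "org.apache.spark" %% "spark-sql"  % "1.6.1" % "provided"
)

Note that sbt will then write the jar to target/scala-2.10/tryudfapp-project_2.10-1.0.jar, so the jar path in the spark-submit command changes accordingly.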
