Why does submitting a Spark application to Mesos fail with "Could not parse Master URL: 'mesos://localhost:5050'"?
Problem description
I'm getting the following exception when I try to submit a Spark application to a Mesos cluster:
17/01/31 17:04:21 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/31 17:04:22 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Could not parse Master URL: 'mesos://localhost:5050'
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2550)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
Recommended answer
You probably built Spark with the wrong command, e.g. one missing the -Pmesos profile. Since Spark 2.1.0, Mesos support is no longer included by default, so you should build with ./build/mvn -Pmesos -DskipTests clean package.
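A minimal sketch of the rebuild-and-resubmit sequence, assuming the current directory is a Spark (>= 2.1.0) source checkout; the master host/port and the SparkPi example are illustrative:

```shell
# Rebuild Spark with the Mesos cluster manager enabled (required since 2.1.0).
./build/mvn -Pmesos -DskipTests clean package

# Sanity check: the spark-mesos jar should now be in the assembly output.
ls assembly/target/scala-*/jars/ | grep -i mesos

# Resubmit against the Mesos master. Without the -Pmesos build, this is the
# step that fails with "Could not parse Master URL: 'mesos://localhost:5050'".
./bin/spark-submit \
  --master mesos://localhost:5050 \
  --class org.apache.spark.examples.SparkPi \
  examples/jars/spark-examples_*.jar 100
```

The error comes from SparkContext's createTaskScheduler, which cannot match the mesos:// URL scheme when the Mesos scheduler backend classes were left out of the build.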