Unable to submit jobs to spark cluster (cluster-mode)
Problem description
Spark version 1.3.0
An error occurs when submitting a job to the Spark cluster in cluster mode:
./spark-submit --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount --deploy-mode cluster wordcount-0.1.jar 172.20.5.174:9092,172.20.9.50:9092,172.20.7.135:9092 log
Yields:
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Running Spark using the REST application submission protocol.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/04/14 16:41:10 INFO StandaloneRestClient: Submitting a request to launch an application in spark://172.20.9.151:7077.
Warning: Master endpoint spark://172.20.9.151:7077 was not a REST server. Falling back to legacy submission gateway instead.
15/04/14 16:41:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Sending launch command to spark://172.20.9.151:7077
Error connecting to master spark://172.20.9.151:7077 (akka.tcp://sparkMaster@172.20.9.151:7077), exiting.
Recommended answer
The master's Spark REST URL is on port 6066 by default, so you should use this as your master endpoint: spark://172.20.9.151:6066.
If you go to the Spark web console (http://master:8080) you will see the details of the various endpoints of your cluster.
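Concretely, the fix is to point --master at the REST endpoint on port 6066 instead of the legacy port 7077. A sketch of the corrected submission, reusing the command and addresses from the question (the original invocation omitted --master, so the 7077 URL presumably came from the cluster's default configuration):

```shell
# Submit in cluster mode via the REST submission gateway.
# Port 6066 is the default for spark.master.rest.port in Spark standalone mode;
# the IPs below are the ones from the question, not generic values.
./spark-submit \
  --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount \
  --master spark://172.20.9.151:6066 \
  --deploy-mode cluster \
  wordcount-0.1.jar \
  172.20.5.174:9092,172.20.9.50:9092,172.20.7.135:9092 log
```

With the REST endpoint, the "Master endpoint ... was not a REST server" warning and the subsequent fallback to the legacy gateway should no longer appear.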