spark-submit cluster mode is not working


Problem description


I am getting an error when launching a standalone Spark driver in cluster mode. According to the documentation, cluster mode is supported as of the Spark 1.2.1 release; however, it is currently not working for me. Please help me fix the issue(s) that are preventing Spark from functioning properly.
I have a 3-node Spark cluster: node1, node2, and node3.

I am running the following command on node 1 to deploy the driver:

/usr/local/spark-1.2.1-bin-hadoop2.4/bin/spark-submit --class com.fst.firststep.aggregator.FirstStepMessageProcessor --master spark://ec2-xx-xx-xx-xx.compute-1.amazonaws.com:7077 --deploy-mode cluster --supervise file:///home/xyz/sparkstreaming-0.0.1-SNAPSHOT.jar /home/xyz/config.properties

The driver gets launched on node 2 in the cluster, but I get an exception on node 2 because it is trying to bind to node 1's IP:

2015-02-26 08:47:32 DEBUG AkkaUtils:63 - In createActorSystem, requireCookie is: off 
2015-02-26 08:47:32 INFO  Slf4jLogger:80 - Slf4jLogger started 
2015-02-26 08:47:33 ERROR NettyTransport:65 - failed to bind to ec2-xx.xx.xx.xx.compute-1.amazonaws.com/xx.xx.xx.xx:0, shutting down Netty transport 
2015-02-26 08:47:33 WARN  Utils:71 - Service 'Driver' could not bind on port 0. Attempting port 1. 
2015-02-26 08:47:33 DEBUG AkkaUtils:63 - In createActorSystem, requireCookie is: off 
2015-02-26 08:47:33 ERROR Remoting:65 - Remoting error: [Startup failed] [ 
akka.remote.RemoteTransportException: Startup failed 
        at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:136) 
        at akka.remote.Remoting.start(Remoting.scala:201) 
        at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184) 
        at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:618) 
        at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:615) 
        at akka.actor.ActorSystemImpl._start(ActorSystem.scala:615) 
        at akka.actor.ActorSystemImpl.start(ActorSystem.scala:632) 
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:141) 
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:118) 
        at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121) 
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54) 
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53) 
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1765) 
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141) 
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1756) 
        at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56) 
        at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:33) 
        at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala) 
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: ec2-xx-xx-xx.compute-1.amazonaws.com/xx.xx.xx.xx:0 
        at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272) 
        at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393) 
        at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389) 
        at scala.util.Success$$anonfun$map$1.apply(Try.scala:206) 
        at scala.util.Try$.apply(Try.scala:161) 
        at scala.util.Success.map(Try.scala:206) 
Kindly suggest.

Thanks

Recommended answer


It is not possible to bind to port 0. There is an error in your Spark configuration. Specifically, look at

spark.webui.port

It is probably set to 0.
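As a hedged sketch (the standard property names in Spark 1.x are spark.ui.port and spark.driver.port; the port values below are assumptions, not taken from the original post), the ports can be pinned to nonzero values either in conf/spark-defaults.conf or per job on the spark-submit command line. Setting SPARK_LOCAL_IP in conf/spark-env.sh on each node can likewise stop the driver launched on node 2 from trying to bind to the submitting machine's address:

```shell
# conf/spark-env.sh on every node (hypothetical fix; adjust to your setup):
# bind Spark services to this node's own address, not one inherited
# from the machine where spark-submit was run
export SPARK_LOCAL_IP=$(hostname -i)

# conf/spark-defaults.conf (example values; any free nonzero ports work):
# spark.ui.port      4040
# spark.driver.port  51000

# or pass the same settings per job on the spark-submit command line:
/usr/local/spark-1.2.1-bin-hadoop2.4/bin/spark-submit \
  --class com.fst.firststep.aggregator.FirstStepMessageProcessor \
  --master spark://ec2-xx-xx-xx-xx.compute-1.amazonaws.com:7077 \
  --deploy-mode cluster --supervise \
  --conf spark.ui.port=4040 \
  --conf spark.driver.port=51000 \
  file:///home/xyz/sparkstreaming-0.0.1-SNAPSHOT.jar /home/xyz/config.properties
```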
