Spark web UI unreachable


Problem description

I have installed Spark 2.0.0 on 12 nodes (in standalone cluster mode). When I launch it, I get this:

./sbin/start-all.sh

starting org.apache.spark.deploy.master.Master, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.master.Master-1-ibnb25.out

localhost192.17.0.17: ssh: Could not resolve hostname localhost192.17.0.17: Name or service not known

192.17.0.20: starting org.apache.spark.deploy.worker.Worker, logging to /home/mbala/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb28.out

192.17.0.21: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb29.out

192.17.0.19: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb27.out

192.17.0.18: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb26.out

192.17.0.24: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb32.out

192.17.0.22: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb30.out

192.17.0.25: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb33.out

192.17.0.28: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb36.out

192.17.0.27: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb35.out

192.17.0.17: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb25.out

192.17.0.26: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb34.out

192.17.0.23: starting org.apache.spark.deploy.worker.Worker, logging to /home/mName/fer/spark-2.0.0-bin-hadoop2.7/logs/spark-mName-org.apache.spark.deploy.worker.Worker-1-ibnb31.out

I have already set the master port to Port=8081, and its IP=192.17.0.17 corresponds to HOSTNAME=ibnb25; I launched the cluster from this host.

From my local machine, I use this command to access the cluster:

 ssh mName@xx.xx.xx.xx 
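As an aside, when the cluster's internal addresses (here the 192.17.0.x range) are not routable from the local machine, the same SSH connection can forward the web UI port. This is a generic SSH port-forwarding sketch, not part of the original question; xx.xx.xx.xx is the question's placeholder for the gateway address:

```shell
# Forward local port 8081 to the master's web UI through the SSH gateway,
# then browse to http://localhost:8081 on the local machine.
ssh -L 8081:192.17.0.17:8081 mName@xx.xx.xx.xx
```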

When I wanted to access the web UI from my local machine, I used the IP address of the master (host ibnb25):

192.17.0.17:8081

but it could not be displayed, so I tried the address that I use to access the cluster:

xx.xx.xx.xx:8081

but nothing is displayed in my browser. What is wrong? Please help me.

Solution

Your /etc/hosts file seems to be incorrectly set up.

You should get the hostname and IP with the following commands:

hostname
hostname -i

Make sure there is a space between the IP and the hostname in each entry.
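As a quick sanity check, the two commands above can be combined into a short script that also inspects /etc/hosts for malformed entries. This is only a sketch; the grep pattern assumes IPv4 entries:

```shell
# Print this node's hostname and the IP it resolves to.
HOST="$(hostname)"
IP="$(hostname -i 2>/dev/null | awk '{print $1}')"
echo "hostname: $HOST"
echo "resolved IP: $IP"

# Every entry needs whitespace between the IP and the name; a missing
# separator yields concatenated strings like "localhost192.17.0.17",
# which is exactly what the ssh error in the question shows.
grep -E '^[0-9.]+[[:space:]]+' /etc/hosts || echo "no well-formed IPv4 entries found"
```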

A sample /etc/hosts file looks like this:

192.17.0.17  <hostname>
192.17.0.17  localhost
<Other IP1>  <other hostname1>
.
.
.
<Other IP-n>  <other hostname-n>

Make sure the /etc/hosts file on every node contains an entry for each IP/hostname in the cluster.
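A small loop like the following can verify this on each node. It is only a sketch; the IP list is taken from the worker log output above and should be adjusted to match your cluster:

```shell
# Check that each cluster IP has a well-formed entry in this node's /etc/hosts.
for ip in 192.17.0.17 192.17.0.18 192.17.0.19 192.17.0.20 192.17.0.21; do
    if grep -q "^${ip}[[:space:]]" /etc/hosts; then
        echo "${ip}: present"
    else
        echo "${ip}: MISSING from /etc/hosts"
    fi
done
```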

For FQDN read this.
