hdfs dfs ls not working after multiple nodes configured


Problem description



I started following an online tutorial to configure multiple nodes on a single local VM. Here are the hosts entries on the master node:

127.0.0.1   localhost
192.168.96.132  hadoop
192.168.96.135  hadoop1
192.168.96.136  hadoop2

ssh:ALL:allow
sshd:ALL:allow
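A quick way to confirm that these names actually resolve to the addresses listed above is to ask the resolver directly. This is only a sketch, assuming a Linux guest where getent is available:

```shell
# Sketch: confirm each hostname resolves to the address in /etc/hosts.
# Assumes a Linux guest with getent available.
resolve() {
  getent hosts "$1" || echo "$1: not resolvable"
}
resolve hadoop
resolve hadoop1
resolve hadoop2
```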

Here is the command that used to work: hdfs dfs -ls

Now I am seeing the error message below:

ls: Call From hadoop/192.168.96.132 to hadoop:9000 failed on connection exception: 
java.net.ConnectException: Connection refused; 
For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
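The linked wiki page explains that "Connection refused" means nothing accepted the connection on that port, i.e. the NameNode is most likely not up and listening. One quick check (a sketch, assuming bash and port 9000 as shown in the error message) is to probe the port directly; jps on the master should also list a NameNode process if HDFS is running:

```shell
# Sketch: is anything listening on the NameNode RPC port?
# Assumes bash's /dev/tcp and port 9000 as seen in the error message.
probe() {
  if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
    echo "$1:$2 open"
  else
    echo "$1:$2 refused or unreachable"
  fi
}
probe hadoop 9000
```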

What is wrong with my configuration? Where should I check and correct it?

Thank you very much.

Solution

First try to ping each of the nodes:

ping hadoop
ping hadoop1
ping hadoop2

Then try to connect to each of them via ssh. The syntax is:

ssh username@hadoop
ssh username@hadoop1
ssh username@hadoop2

Then check the results to find out whether the systems are reachable or not.
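The steps above can be sketched as one small script. The hostnames and the "username" placeholder are taken from the question; substitute your own, and note that the ping timeout flag (-W) may differ on non-Linux systems:

```shell
# Sketch of the checks above: ping each node, then try ssh.
# "username" is the placeholder from the answer; substitute your own user.
check_node() {
  if ping -c 1 -W 2 "$1" >/dev/null 2>&1; then
    echo "$1: ping ok"
    if ssh -o ConnectTimeout=5 -o BatchMode=yes "username@$1" true 2>/dev/null; then
      echo "$1: ssh ok"
    else
      echo "$1: ssh failed"
    fi
  else
    echo "$1: ping failed"
  fi
}
for h in hadoop hadoop1 hadoop2; do
  check_node "$h"
done
```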
