Hadoop (local and host destination do not match) after installing Hive


Question


I have installed Hadoop 2.2 on my laptop running Ubuntu as a single-node cluster and ran the word-count example. After that I installed Hive, and Hadoop started to give an error, i.e.

hdfs dfs -ls throws an IOException: local host is "utbuntu/127.0.1.1" and destination host is "localhost:9000"

I found the following two entries in my hosts file:

127.0.0.1 localhost
127.0.1.1 ubuntu
#and some IPv6 entries...

My question is: why is this error occurring after configuring Hive, and what is the solution? Any help is really appreciated.

Thanks!

Solution

There seems to be a typo, 'utbuntu', in your original IOException. Can you check whether that's the right hostname or a copy-paste error?

The etc/hosts config took some trial and error to figure out for a Hadoop 2.2.0 cluster setup, but what I did was remove all 127.0.1.1 assignments to the hostname, assign the machine's actual IP to its name, and it works. e.g.

192.168.1.101 ubuntu
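Before changing anything, it can help to see how the names actually resolve on the box. A minimal sketch of that check, assuming a Linux system with `getent` available:

```shell
# Print the machine name and what it resolves to. If the machine name
# resolves to 127.0.1.1 while the NameNode URI points at localhost:9000,
# Hadoop's local/destination host check can fail with this IOException.
echo "machine name: $(hostname)"
echo "localhost resolves to: $(getent hosts localhost | awk '{print $1; exit}')"
# A 127.0.1.1 result here is the symptom to fix in /etc/hosts:
echo "hostname resolves to: $(getent hosts "$(hostname)" | awk '{print $1; exit}')"
```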

I have a 2-node cluster so my /etc/hosts for master (NameNode) looks like:

127.0.0.1   localhost
#127.0.1.1  myhostname
192.168.1.100   myhostname
192.168.1.100   master
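As a quick sanity check of that file, you can grep for any 127.0.1.1 line that is still active. This is a sketch against a sample copy, not the real /etc/hosts:

```shell
# Sample hosts file mirroring the layout above (on a real node,
# run the grep against /etc/hosts itself).
cat > /tmp/hosts.sample <<'EOF'
127.0.0.1   localhost
#127.0.1.1  myhostname
192.168.1.100   myhostname
192.168.1.100   master
EOF
# Only commented-out 127.0.1.1 entries should remain:
if grep -q '^[^#]*127\.0\.1\.1' /tmp/hosts.sample; then
  echo "active 127.0.1.1 mapping found - comment it out"
else
  echo "no active 127.0.1.1 mapping"
fi
```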

And /usr/local/hadoop/etc/hadoop/core-site.xml has the following:

<property>
   <name>fs.default.name</name>
   <value>hdfs://master:9000</value>
 </property>
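To confirm which filesystem URI clients will actually use, you can extract the value from the XML. A dependency-free sketch using a sample copy of the fragment (on a real node you would point this at the core-site.xml path above):

```shell
# Write a sample core-site.xml matching the fragment above.
cat > /tmp/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://master:9000</value>
  </property>
</configuration>
EOF
# Crude extraction of the <value> element; assumes one value per file.
sed -n 's:.*<value>\(.*\)</value>.*:\1:p' /tmp/core-site.xml
```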

The main thing to note is that I've commented out the 127.0.1.1 association for myhostname.
