Errors while running hadoop


Problem description

haduser@user-laptop:/usr/local/hadoop$ bin/hadoop dfs -copyFromLocal /tmp/input 
/user/haduser/input

11/12/14 14:21:00 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).

11/12/14 14:21:01 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 1 time(s).

11/12/14 14:21:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 2 time(s).

11/12/14 14:21:03 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 3 time(s).

11/12/14 14:21:04 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 4 time(s).

11/12/14 14:21:05 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 5 time(s).

11/12/14 14:21:06 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 6 time(s).

11/12/14 14:21:07 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 7 time(s).

11/12/14 14:21:08 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 8 time(s).

11/12/14 14:21:09 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 9 time(s).

Bad connection to FS. command aborted. exception: Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused


I am getting the above errors when I try to copy files from /tmp/input to /user/haduser/input, even though the file /etc/hosts contains an entry for localhost. When the jps command is run, the TaskTracker and the NameNode are not listed.


What could be the problem? Could someone please help me with this?
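Before changing any configuration, it is worth confirming what the log already suggests: "Connection refused" on 127.0.0.1:54310 usually means no NameNode process is listening on that port at all. A small helper like the one below (a hypothetical sketch, not part of Hadoop; it uses bash's `/dev/tcp` redirection) can verify this quickly:

```shell
# Hypothetical helper: report whether anything accepts TCP connections
# on the given host and port. Uses bash's /dev/tcp pseudo-device.
check_port() {
  if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

# The NameNode RPC port from the log above; "closed" matches the
# ConnectException and means the daemon is not running (or not bound here).
check_port localhost 54310
```

If the port is closed, check `jps` output and the NameNode log under `$HADOOP_HOME/logs` for the reason the daemon failed to start, rather than debugging the client side.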

Recommended answer


I had similar issues - actually, Hadoop was binding to IPv6. I then added "export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true" to $HADOOP_HOME/conf/hadoop-env.sh.


Hadoop was binding to IPv6 even though I had disabled IPv6 on my system. Once I added that line to the environment file, everything started working fine.
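Concretely, the change is a single line in the environment file (the `conf/` path assumes an older Hadoop 0.20/1.x layout; in newer releases the file lives under `etc/hadoop/`):

```shell
# In $HADOOP_HOME/conf/hadoop-env.sh:
# make the JVM prefer the IPv4 stack, so the daemons bind to
# 127.0.0.1:54310 instead of an IPv6 address the client never tries.
export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true
```

Note that the flag only takes effect on the next JVM start, so restart the daemons (e.g. `bin/stop-all.sh` followed by `bin/start-all.sh` on Hadoop 1.x) before retrying the copy.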

Hope this helps someone.
