Can't start NameNode daemon and DataNode daemon in Hadoop


Question

I am trying to run Hadoop in pseudo-distributed mode, following this tutorial: http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html

I can ssh to my localhost and format the filesystem. However, I can't start the NameNode and DataNode daemons with this command:

    sbin/start-dfs.sh

When I execute it with sudo, I get:

    ubuntu@ip-172-31-42-67:/usr/local/hadoop-2.6.0$ sudo sbin/start-dfs.sh 
    Starting namenodes on [localhost]
    localhost: Permission denied (publickey).
    localhost: Permission denied (publickey).
    Starting secondary namenodes [0.0.0.0] 
    0.0.0.0: Permission denied (publickey).
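
The `Permission denied (publickey)` lines appear because `sudo` makes the start script open its ssh connections as root, and root's key is not in `~/.ssh/authorized_keys`. The usual fix (as in the single-node setup guide the question links to) is to give your own user a passphrase-less key and run `start-dfs.sh` without sudo. A minimal sketch, run against a scratch directory so it does not touch a real `~/.ssh`; on the actual machine, point the commands at `~/.ssh` instead:

```shell
# KEYDIR stands in for ~/.ssh on the real machine.
KEYDIR=$(mktemp -d)

# Generate a passphrase-less RSA key pair, quietly.
ssh-keygen -t rsa -P '' -f "$KEYDIR/id_rsa" -q

# Authorize the public key for logins to this host.
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"

# sshd refuses authorized_keys files with looser permissions.
chmod 0600 "$KEYDIR/authorized_keys"
```

After this, `ssh localhost` should succeed without a password for that user, and `sbin/start-dfs.sh` can be run without sudo.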

And when I execute it without sudo:

    ubuntu@ip-172-31-42-67:/usr/local/hadoop-2.6.0$ sbin/start-dfs.sh 
    Starting namenodes on [localhost]
    localhost: mkdir: cannot create directory ‘/usr/local/hadoop-2.6.0/logs’: Permission denied
    localhost: chown: cannot access ‘/usr/local/hadoop-2.6.0/logs’: No such file or directory
    localhost: starting namenode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out
    localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out: No such file or directory
    localhost: head: cannot open ‘/usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out’ for reading: No such file or directory
    localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out: No such file or directory
    localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-namenode-ip-172-31-42-67.out: No such file or directory
    localhost: mkdir: cannot create directory ‘/usr/local/hadoop-2.6.0/logs’: Permission denied
    localhost: chown: cannot access ‘/usr/local/hadoop-2.6.0/logs’: No such file or directory
    localhost: starting datanode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out
    localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out: No such file or directory
    localhost: head: cannot open ‘/usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out’ for reading: No such file or directory
    localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out: No such file or directory
    localhost: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-datanode-ip-172-31-42-67.out: No such file or directory
    Starting secondary namenodes [0.0.0.0]
    0.0.0.0: mkdir: cannot create directory ‘/usr/local/hadoop-2.6.0/logs’: Permission denied
    0.0.0.0: chown: cannot access ‘/usr/local/hadoop-2.6.0/logs’: No such file or directory
    0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out
    0.0.0.0: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out: No such file or directory
    0.0.0.0: head: cannot open ‘/usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out’ for reading: No such file or directory
    0.0.0.0: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out: No such file or directory
    0.0.0.0: /usr/local/hadoop-2.6.0/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop-2.6.0/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-42-67.out: No such file or directory

I also notice that ls now fails when checking the contents of HDFS directories, as here:

    ubuntu@ip-172-31-42-67:~/dir$ hdfs dfs -ls output/
    ls: Call From ip-172-31-42-67.us-west-2.compute.internal/172.31.42.67 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
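
The `Connection refused` on `localhost:9000` means nothing is listening on the `fs.defaultFS` port, i.e. the NameNode never came up because of the log-directory failures above. A quick read-only way to confirm, assuming bash (for its `/dev/tcp` pseudo-device) and the port 9000 used in this setup:

```shell
# Probe the NameNode RPC port without any Hadoop tooling.
# Port 9000 matches fs.defaultFS in this configuration.
if (exec 3<>/dev/tcp/localhost/9000) 2>/dev/null; then
    out="something is listening on 9000"
else
    out="connection refused: no NameNode on 9000"
fi
echo "$out"
```

Once the permission problem is fixed and `start-dfs.sh` succeeds, the probe should report the port as open and `hdfs dfs -ls` should work again.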

Can anyone tell me what the problem could be?

Answer

The errors above suggest a permission problem. You have to make sure that the user running Hadoop has the proper privileges over /usr/local/hadoop. For this you can try:

    sudo chown -R hadoop /usr/local/hadoop/
    sudo chmod 777 /usr/local/hadoop/
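
A narrower variant, if you would rather not open the whole tree with mode 777: create just the `logs` directory that the start scripts could not make, and hand the tree to the user that actually runs the daemons (`ubuntu` in the output above). Sketched here against a scratch directory; on the real machine, substitute /usr/local/hadoop-2.6.0 and run the `chown` as root:

```shell
# HADOOP_HOME stands in for /usr/local/hadoop-2.6.0.
HADOOP_HOME=$(mktemp -d)

# The directory start-dfs.sh failed to create ("mkdir: cannot
# create directory ... Permission denied").
mkdir -p "$HADOOP_HOME/logs"

# Writable by the owning user; no need for world-writable 777.
chmod 755 "$HADOOP_HOME/logs"

# On the real tree, as root:
#   chown -R ubuntu:ubuntu /usr/local/hadoop-2.6.0
```

With the `logs` directory owned by the daemon user, `sbin/start-dfs.sh` can write its `.out` files and the NameNode, DataNode, and SecondaryNameNode should start.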
