Hadoop start-all.sh error: No such file or directory


Question


After I successfully created the name node, I ran into this problem when trying to start it. To me it looks as if the script is trying to log to a file that does not exist. How can I change my setup to direct the script logs to the correct directory?

bash-3.2$ start-all.sh
starting namenode, logging to /usr/local/bin/../logs/hadoop-Yili-namenode-wifi169-116.bucknell.edu.out
nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting datanode, logging to /usr/local/bin/../logs/hadoop-Yili-datanode-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting secondarynamenode, logging to /usr/local/bin/../logs/hadoop-Yili-secondarynamenode-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
starting jobtracker, logging to /usr/local/bin/../logs/hadoop-Yili-jobtracker-wifi169-116.bucknell.edu.out
nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting tasktracker, logging to /usr/local/bin/../logs/hadoop-Yili-tasktracker-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory

Solution

Try running which hadoop. If that command prints a path, then HADOOP_HOME has already been set in your .bashrc file.
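Why which hadoop is the right first check: every failing line in the log is nice complaining about /usr/local/bin/../bin/hadoop, i.e. the start scripts look for the hadoop launcher relative to their own directory. A small sketch of how that relative path resolves (the path mirrors the log output above):

```shell
# /usr/local/bin/../bin/hadoop is the same location as /usr/local/bin/hadoop;
# the daemons fail because no hadoop launcher actually exists there.
resolved="$(cd /usr/local/bin/.. && pwd)/bin/hadoop"
echo "$resolved"    # prints /usr/local/bin/hadoop
```

So unless the hadoop launcher is on the PATH (or sits at that resolved location), every daemon the script starts dies with the same "No such file or directory" error.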

If it is not set, edit the .bashrc file in your home directory and add the statements below, assuming your Hadoop is installed under /opt/hadoop; adjust the path if it lives somewhere else.

HADOOP_HOME=/opt/hadoop
export HADOOP_HOME
PATH=$PATH:$HADOOP_HOME/bin
export PATH

Hope this helps.
