Hadoop: start-dfs.sh permission denied


Problem description


I am installing Hadoop on my laptop. SSH works fine, but I cannot start hadoop.

munichong@GrindPad:~$ ssh localhost
Welcome to Ubuntu 12.10 (GNU/Linux 3.5.0-25-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

0 packages can be updated.
0 updates are security updates.

Last login: Mon Mar  4 00:01:36 2013 from localhost

munichong@GrindPad:~$ /usr/sbin/start-dfs.sh
chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
starting namenode, logging to /var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-namenode.pid: Permission denied
usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out: Permission denied
head: cannot open `/var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out' for reading: No such file or directory
localhost: chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
localhost: starting datanode, logging to /var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out
localhost: /usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out: Permission denied
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-datanode.pid: Permission denied
localhost: head: cannot open `/var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out' for reading: No such file or directory
localhost: chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
localhost: starting secondarynamenode, logging to /var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-secondarynamenode.pid: Permission denied
localhost: /usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out: Permission denied
localhost: head: cannot open `/var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out' for reading: No such file or directory

munichong@GrindPad:~$ sudo /usr/sbin/start-dfs.sh
[sudo] password for munichong: 
starting namenode, logging to /var/log/hadoop/root/hadoop-root-namenode-GrindPad.out
localhost: Permission denied (publickey,password).
localhost: Permission denied (publickey,password).
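
As an aside, the chown and "Permission denied" lines in the transcript above point at the ownership of the log and pid directories. A minimal read-only check (directory names taken from the error output; they may not exist on another machine, and the suggested chown is only a guess) might be:

```shell
# Read-only check: who owns the log and pid directories from the transcript?
# (Paths taken from the error output above; they may not exist elsewhere.)
for d in /var/log/hadoop/root /var/run/hadoop; do
    ls -ld "$d" 2>/dev/null || echo "not present here: $d"
done
# If they turn out to be root-owned, one common (hedged) fix is to hand them
# over to the daemon user, e.g.:
#   sudo chown -R munichong: /var/log/hadoop /var/run/hadoop
```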

I used "sudo", but permission is still denied.

Can anyone help me?

Thanks in advance!

Solution

I was stuck on the same issue for the last couple of hours but finally solved it. The Hadoop installation had been extracted by the same user I use to run Hadoop, so user privileges were not the issue.
My configuration is as follows: an Ubuntu Linux machine on Google Cloud.

Hadoop is installed under /home/, the Hadoop data directory is /var/lib/hadoop, and the directory access bits are 777, so anybody can access it. I SSHed into the remote machine, made changes to the config files, and executed start-dfs.sh; it then gave me "Permission denied (publickey)". So here is the solution. In the same SSH terminal:

  1. ssh-keygen

  2. It will ask for the file location where it will save the keys; I entered /home/hadoop/.ssh/id_rsa

  3. It will ask for a passphrase; keep it empty for simplicity.

  4. cp /home/hadoop/.ssh/id_rsa.pub .ssh/authorized_keys (to copy the newly generated public key to the authorized_keys file in your user's home/.ssh directory)

  5. ssh localhost

  6. start-dfs.sh (Now it should work!)
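
The steps above can be sketched as a small script. It uses a scratch directory in place of the real ~/.ssh so it can be tried safely; on an actual machine you would use $HOME/.ssh and then run ssh localhost and start-dfs.sh, which are left as comments here since they need a running sshd and a Hadoop install:

```shell
# Sketch of steps 1-4, using a scratch directory instead of the real ~/.ssh.
SSH_DIR=$(mktemp -d)            # swap in "$HOME/.ssh" on a real machine

# Steps 1-3: generate a key pair non-interactively, with an empty passphrase
# (-N "") and an explicit location (-f), instead of answering the prompts.
ssh-keygen -q -t rsa -N "" -f "$SSH_DIR/id_rsa"

# Step 4: authorize the new public key. Appending (>>) is gentler than the
# cp in the answer above, which would overwrite any previously authorized keys.
cat "$SSH_DIR/id_rsa.pub" >> "$SSH_DIR/authorized_keys"
chmod 600 "$SSH_DIR/authorized_keys"    # sshd rejects loosely-permissioned files

# Steps 5-6, on the real machine:
#   ssh localhost
#   start-dfs.sh
```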
