Hadoop: start-dfs.sh permission denied
Question
I am installing Hadoop on my laptop. SSH works fine, but I cannot start Hadoop.
munichong@GrindPad:~$ ssh localhost
Welcome to Ubuntu 12.10 (GNU/Linux 3.5.0-25-generic x86_64)
* Documentation: https://help.ubuntu.com/
0 packages can be updated.
0 updates are security updates.
Last login: Mon Mar 4 00:01:36 2013 from localhost
munichong@GrindPad:~$ /usr/sbin/start-dfs.sh
chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
starting namenode, logging to /var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out
/usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-namenode.pid: Permission denied
usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out: Permission denied
head: cannot open `/var/log/hadoop/root/hadoop-munichong-namenode-GrindPad.out' for reading: No such file or directory
localhost: chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
localhost: starting datanode, logging to /var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out
localhost: /usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out: Permission denied
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-datanode.pid: Permission denied
localhost: head: cannot open `/var/log/hadoop/root/hadoop-munichong-datanode-GrindPad.out' for reading: No such file or directory
localhost: chown: changing ownership of `/var/log/hadoop/root': Operation not permitted
localhost: starting secondarynamenode, logging to /var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out
localhost: /usr/sbin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-munichong-secondarynamenode.pid: Permission denied
localhost: /usr/sbin/hadoop-daemon.sh: line 135: /var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out: Permission denied
localhost: head: cannot open `/var/log/hadoop/root/hadoop-munichong-secondarynamenode-GrindPad.out' for reading: No such file or directory
munichong@GrindPad:~$ sudo /usr/sbin/start-dfs.sh
[sudo] password for munichong:
starting namenode, logging to /var/log/hadoop/root/hadoop-root-namenode-GrindPad.out
localhost: Permission denied (publickey,password).
localhost: Permission denied (publickey,password).
I used "sudo", but permission is still denied.
Can someone help me? Thanks in advance!
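As an aside (this is my suggestion, not part of the accepted answer below): the chown and .pid/.out failures in the first log usually mean /var/log/hadoop and /var/run/hadoop are owned by root, so a non-root user cannot write logs or PID files there. A common workaround is to hand those directories to the user that runs the daemons; the paths are taken from the error messages, so adjust them if your layout differs:

```shell
# Assumption: these are the directories from the error log above.
HADOOP_LOG_DIR=/var/log/hadoop
HADOOP_PID_DIR=/var/run/hadoop

# Create them if missing, then give ownership to the daemon user
# so the log (.out) and PID (.pid) writes succeed without sudo.
sudo mkdir -p "$HADOOP_LOG_DIR" "$HADOOP_PID_DIR"
sudo chown -R "$USER" "$HADOOP_LOG_DIR" "$HADOOP_PID_DIR"
```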
Accepted answer
I was stuck on the same issue for the last couple of hours but finally solved it. The Hadoop installation had been extracted by the same user I use to run Hadoop, so user privileges were not the issue.
My configuration is like this: an Ubuntu Linux machine on Google Cloud.
The Hadoop installation is under /home/, the Hadoop data directory is /var/lib/hadoop, and the directory access bits are 777, so anybody can access them. I SSHed into the remote machine, made changes to the config files, and executed start-dfs.sh; it then gave me "Permission denied (publickey)". So here is the solution. In the same SSH terminal:
1. Run ssh-keygen.
2. It will ask for the file location where it will save the keys; I entered /home/hadoop/.ssh/id_rsa.
3. It will ask for a passphrase; keep it empty for simplicity.
4. cat /home/hadoop/.ssh/id_rsa.pub >> .ssh/authorized_keys (copies the newly generated public key to the auth file in your user's home/.ssh directory).
5. ssh localhost
6. start-dfs.sh (now it should work!)