hadoop fs -put command
Problem description
I have set up a single-node Hadoop environment on CentOS using the Cloudera CDH repository. When I want to copy a local file to HDFS, I use the command:

sudo -u hdfs hadoop fs -put /root/MyHadoop/file1.txt /

But the result disappointed me:

put: '/root/MyHadoop/file1.txt': No such file or directory

I'm sure this file exists.

Please help me, thanks!
Solution
As the hdfs user, do you have access rights to /root/ (on your local hard drive)? Usually you don't. You must copy file1.txt to a place where the hdfs user has read rights.
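The failure mode is easy to reproduce without touching /root itself: a directory with mode 700 cannot be traversed by anyone but its owner, which is exactly why the hdfs user cannot reach /root/MyHadoop/file1.txt. A small sketch with a demo directory (paths are made up for illustration):

```shell
# Reproduce the cause with a demo directory instead of /root.
DEMO=$(mktemp -d)
mkdir "$DEMO/private"
echo secret > "$DEMO/private/file1.txt"
chmod 700 "$DEMO/private"        # same mode /root normally has

# As the owner we can still read the file...
cat "$DEMO/private/file1.txt"

# ...but mode 700 grants nothing to group or others, so any other
# user (such as hdfs) gets "Permission denied" on this path:
stat -c '%a' "$DEMO/private"     # prints 700
```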
Try:
cp /root/MyHadoop/file1.txt /tmp
chown hdfs:hdfs /tmp/file1.txt
sudo -u hdfs hadoop fs -put /tmp/file1.txt /
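The same fix can be written as a short script. This is a minimal sketch, not the asker's exact session: the source file is a demo stand-in for /root/MyHadoop/file1.txt, chmod 644 stands in for the chown (which requires root), and the upload step is guarded so the sketch also runs on a machine without Hadoop installed:

```shell
# Sketch of the staged upload, assuming an "hdfs" service user and a
# running cluster for the final step.
SRC=$(mktemp -d)/file1.txt       # demo stand-in for /root/MyHadoop/file1.txt
echo "demo data" > "$SRC"

STAGE=/tmp/$(basename "$SRC")
cp "$SRC" "$STAGE"               # stage where any user can reach the file
chmod 644 "$STAGE"               # make it world-readable for the hdfs user

if command -v hadoop >/dev/null 2>&1; then
  sudo -u hdfs hadoop fs -put "$STAGE" /   # the actual upload, as hdfs
fi
```

After the upload succeeds, the staged copy in /tmp can simply be deleted; it exists only so a user other than root can read the file.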
--- edit:
Take a look at roman-nikitchenko's cleaner answer below.