Hadoop does not see the input folder
Question
I'm trying to install Hadoop 2.7.1 (standalone mode) on Ubuntu Server 14.04.3 LTS. Following the main Apache tutorial (https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html), I can start the process and see dfshealth.html#tab-datanode at port 50070, but I can't get beyond that point. I'm stuck on "Copy the input files into the distributed filesystem" ($ bin/hdfs dfs -put etc/hadoop input). The files are there, but I'm getting the following exception:
joao@vmdeb20:~/hadoop2_7_1/hadoop-2.7.1$ bin/hdfs dfs -put input
put: `.': No such file or directory
Answer
This is because the user directory was not found. By default, Hadoop copies the input into the user's home directory /user/username when no destination directory is specified.
Either create a user directory in Hadoop, or copy the file using the command below:
bin/hdfs dfs -put input /
The above command copies the input directory into the root (/) of HDFS.
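The first alternative, creating the user's home directory in HDFS, can be sketched as follows. This is a minimal sketch assuming a running HDFS: the username joao is taken from the shell prompt in the question and should be replaced with your own.

```
# Create the HDFS home directory for the current user
# ("joao" comes from the prompt in the question; use your own username)
bin/hdfs dfs -mkdir -p /user/joao

# With the home directory in place, a relative destination now resolves
# to /user/joao, so the tutorial's command works as written:
bin/hdfs dfs -put input
```

With -mkdir -p, parent directories such as /user are created as needed, mirroring the behavior of the Unix mkdir -p flag.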