WEBHDFS REST API to copy/move files from Windows server/local folder/desktop to HDFS
Problem description
Using WEBHDFS REST API calls, can I transfer or copy files from a Windows machine (i.e. a Windows server, a local Windows folder, or the desktop) to the Hadoop HDFS file system?
If yes, is there any sample command info?
I have tried, and I was able to do it using
Windows -> (ftp) -> Linux directory -> (webhdfs) -> HDFS, but this is a two-step process; I am looking for a one-step process directly from Windows -> (webhdfs) -> HDFS.
I also referred to https://hadoop.apache.org/docs/r1.0.4/webhdfs.html for helpful info.
Example: my file is in E:\user\accounts.txt and I want to move this file to the HDFS folder /user/kumar/ using webhdfs.
Currently what I am doing is:
Step 1) ftp the accounts file from Windows to a Linux directory.
Step 2) run curl commands to move the file from the Linux machine to the HDFS folder.
Any suggestion to do it in one step, i.e. Windows -> HDFS using webhdfs?
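The WebHDFS CREATE operation documented at the Apache link above is a two-request exchange that can be driven entirely from the Windows machine with curl.exe, which gives the one-step workflow asked for here. A sketch, with a hypothetical NameNode host and the classic default HTTP port 50070:

```shell
# Hypothetical cluster details -- substitute your NameNode host and HTTP port.
NAMENODE="hadoop-host.example.com"
PORT=50070
HDFS_PATH="/user/kumar/accounts.txt"

# Request 1 goes to the NameNode; it replies 307 with a DataNode URL in the
# Location header (no file data is sent yet).
CREATE_URL="http://${NAMENODE}:${PORT}/webhdfs/v1${HDFS_PATH}?op=CREATE&overwrite=true"
echo "${CREATE_URL}"

# Request 2 uploads the file body to that DataNode URL. Both curl calls are
# shown commented out because they need a live cluster to succeed:
#   curl -i -X PUT "${CREATE_URL}"
#   curl -i -X PUT -T "E:/user/accounts.txt" "<Location header from request 1>"
```

With this, the Windows box talks to HDFS directly and the intermediate ftp-to-Linux hop disappears.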
We can copy files from the Windows file system to the Hadoop node by using the scp command:
scp source_file_name user@hostname:/path/file_name
We can also achieve this with the WinSCP tool: install it, establish a connection to the Hadoop server, and then the files can be transferred.
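Note that scp (or WinSCP) by itself only lands the file on the Linux filesystem of the Hadoop node; an hdfs dfs -put is still needed to move it into HDFS, so this remains a two-step process. A sketch of that pairing, with a hypothetical user, host, and staging path:

```shell
# Hypothetical connection details -- adjust user, host, and paths.
HADOOP_NODE="kumar@hadoop-host.example.com"
LOCAL_FILE="E:/user/accounts.txt"     # Windows-side path as seen by scp
REMOTE_TMP="/tmp/accounts.txt"        # staging spot on the Linux node
HDFS_DIR="/user/kumar/"

# Build the two commands; they are echoed rather than executed here
# because they need a reachable cluster.
SCP_CMD="scp ${LOCAL_FILE} ${HADOOP_NODE}:${REMOTE_TMP}"
PUT_CMD="ssh ${HADOOP_NODE} hdfs dfs -put ${REMOTE_TMP} ${HDFS_DIR}"
echo "${SCP_CMD}"
echo "${PUT_CMD}"
```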