Writing data to Hadoop

Problem description

I need to write data into Hadoop (HDFS) from external sources such as a Windows box. Right now I have been copying the data onto the namenode and using HDFS's put command to ingest it into the cluster. In my browsing of the code I didn't see an API for doing this. I am hoping someone can show me that I am wrong and that there is an easy way to code external clients against HDFS.

Solution

Install Cygwin, install Hadoop locally (you just need the binaries and the configs that point at your NN -- no need to actually run the services), then run hadoop fs -copyFromLocal /path/to/localfile /hdfs/path/
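
To answer the API question directly: the same copy can also be done programmatically through Hadoop's Java FileSystem API, which an external client can use as long as it can reach the NameNode. Below is a minimal sketch, assuming the Hadoop client jars are on the classpath; the NameNode URI and the file paths are placeholders:

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsPut {
        public static void main(String[] args) throws Exception {
            // Point the client at the NameNode (placeholder host/port).
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(
                    URI.create("hdfs://namenode-host:8020"), conf);

            // Programmatic equivalent of `hadoop fs -copyFromLocal`.
            fs.copyFromLocalFile(
                    new Path("/path/to/localfile"),
                    new Path("/hdfs/path/"));

            fs.close();
        }
    }

If the data isn't already in a local file, fs.create(new Path(...)) returns an output stream you can write to directly instead of copying a file.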

You can also use the new Cloudera Desktop to upload a file via the web UI, though that might not be a good option for giant files.

There's also a WebDAV overlay for HDFS, but I don't know how stable/reliable that is.
