Copy from Hadoop to local machine


Problem Description

I can ssh to our box and do a hadoop fs -ls /theFolder and browse the files, etc., but that's about all I know :) My goal is to copy one of those files - they are Avro - onto my local home folder.

How can I do this? I also found a get command, but I'm not sure how to use that either.

Recommended Answer

First, use hadoop fs -get /theFolder to copy it into the current directory you are ssh'ed into on your box.
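For example, assuming /theFolder is the HDFS path from the question, the command might look like this, where . is the current working directory on the box:

hadoop fs -get /theFolder .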

Then you can use either scp or, my preference, rsync to copy the files between your box and your local system, like so. Here's how I'd use rsync after having used -get, still in the same directory:

rsync -av ./theFolder username@yourlocalmachine:/home/username

This will copy theFolder from the local fs on your box into your home folder on your machine's fs. Be sure to replace username with your actual username in both cases, and yourlocalmachine with your machine's hostname or IP address.
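If you'd rather use scp, a roughly equivalent command (using the same placeholder names, and assuming the target directory already exists on your machine) would be:

scp -r ./theFolder username@yourlocalmachine:/home/username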
