How to copy Hbase data to local file system (external drive)
I want to have a backup of the HBase data that is stored in HDFS. I have an external drive (USB hard disk). How can I copy data from HBase to my drive? I used a command like

bin/hbase org.apache.hadoop.mapreduce.Drive export table /media/.../mydrive

but what actually happens is that a new directory with the path /media/.../mydrive is created in HDFS, and nothing is saved on my external hard disk. Why is this happening? Is there a way to specify that the data should be saved on my external drive, other than the command I used above?

Please reply.
If the data is in HDFS, you can use the commands below to copy it from HDFS to the local file system; once it is on the local system, you can do whatever you want with the data.
hadoop fs -get path_of_hdfs_system_where_your_data_is path_of_local_file_system
For example:
hadoop fs -get /user/hive/warehouse/rocky.db/test5_bucket/000002_0 /home/yarn
hadoop fs -copyToLocal /user/hive/warehouse/rocky.db/test5_bucket/000002_0 /home/yarn
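Putting the two steps together for the HBase case, a minimal sketch of a full backup workflow might look like the following. The table name `mytable`, the HDFS staging directory `/tmp/mytable_export`, and the mount point `/media/usb/mydrive` are all assumptions; substitute your own table and the actual mount point of your external drive.

```shell
# Step 1: export the HBase table into a directory in HDFS.
# (The Export tool writes to HDFS, which is why a plain /media/... path
# ends up as an HDFS directory rather than on the local disk.)
hbase org.apache.hadoop.hbase.mapreduce.Export mytable /tmp/mytable_export

# Step 2: copy the exported data from HDFS to the mounted external drive.
hadoop fs -copyToLocal /tmp/mytable_export /media/usb/mydrive/mytable_export
```

The key point is that `hbase ... Export` always targets a Hadoop filesystem path, so the copy to a locally mounted drive has to be a separate step with `hadoop fs -get` or `hadoop fs -copyToLocal`.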