Sqoop exporting partitioned Hive table


Problem description



I encountered some problems when trying to export a partitioned Hive table. Is this fully supported? (I tried to Google it and found one JIRA ticket.)

sqoop export --connect jdbc:mysql://localhost/testdb --table sales --export-dir /user/hive/warehouse/sales --direct

And here is what I get:

00000_2, Status: FAILED
java.io.FileNotFoundException: File does not exist: /user/hive/warehouse/sales/day=2013-04-01

Running

hadoop fs -ls /user/hive/warehouse/sales/day=2013-04-01

shows that this directory actually exists.

Any ideas on how to solve this? Thanks in advance, Diddy

Solution

Sqoop currently does not support exporting recursive directories. There is a JIRA, SQOOP-951, for adding such support. The workaround for the time being is to export one partition at a time, or to temporarily copy your data to a non-partitioned table.
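The one-partition-at-a-time workaround can be sketched as a shell loop over the `day=` subdirectories. The partition values and connect string below are illustrative (taken from the question); the `EXPORT_CMD="echo sqoop"` prefix makes this a dry run that only prints the commands, so you can verify the paths before dropping the `echo` to actually run Sqoop:

```shell
# Export each day= partition directory separately, since Sqoop cannot
# recurse into partition subdirectories (SQOOP-951).
# Set EXPORT_CMD="sqoop" to run for real; "echo sqoop" just prints the commands.
EXPORT_CMD="echo sqoop"
for day in 2013-04-01 2013-04-02; do
  $EXPORT_CMD export \
    --connect jdbc:mysql://localhost/testdb \
    --table sales \
    --export-dir "/user/hive/warehouse/sales/day=$day" \
    --direct
done
```

In a real setup you would generate the list of partition values from `hadoop fs -ls /user/hive/warehouse/sales` (or from `SHOW PARTITIONS sales` in Hive) rather than hard-coding them. For the other workaround, copying into a non-partitioned staging table can be done in Hive with an `INSERT OVERWRITE TABLE sales_flat SELECT ... FROM sales;` before exporting that table's single directory.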

