Sqoop function '--map-column-hive' being ignored


Problem Description


I am trying to import a table into Hive as Parquet, and the --map-column-hive column_name=timestamp option is being ignored. The column 'column_name' is originally of type datetime in SQL Server, and it gets converted to bigint in Parquet. I want to convert it to timestamp format through Sqoop, but it is not working.

sqoop import \
--table table_name \
--driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
--connect jdbc:sqlserver://servername \
--username user --password pw \
--map-column-hive column_name=timestamp \
--as-parquetfile \
--hive-import \
--hive-table table_name -m 1


When I view the table in Hive, it still shows the column with its original datatype.


I tried column_name=string and that did not work either.


I think this may be an issue with converting the files to Parquet, but I am not sure. Does anyone have a solution to fix this?


I get no errors when running the command; it just completes the import as if the option did not exist.
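For reference, a quick way to confirm what Hive actually created (a minimal sketch; table_name is the placeholder used in the command above) is to describe the table from the shell:

# Show the Hive schema for the imported table; the datetime
# column appears as bigint instead of timestamp.
hive -e "DESCRIBE table_name;"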

Recommended Answer


Before Hive version 1.2, timestamp support in the Parquet SerDe is not available; in 1.1.0 only the binary data type is supported.
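A quick way to confirm which release you are running (a minimal sketch; the exact output format varies by distribution):

# Print the Hive version; anything below 1.2 lacks timestamp
# support in the Parquet SerDe.
hive --version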



Please upgrade your Hive version to 1.2 or later; after that, it should work.

Please check the issue log and release notes below.

https://issues.apache.org/jira/browse/HIVE-6384

https://issues.apache.org/jira/secure/ReleaseNote.jspa?version=12329345&styleName=Text&projectId=12310843
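If upgrading is not immediately an option, one common workaround (a sketch only, not from the original answer; it assumes Sqoop stored the datetime as epoch milliseconds in the bigint column, which you should verify against a known row) is to convert at query time:

# Convert the epoch-millisecond bigint to a readable timestamp;
# DIV 1000 drops the milliseconds, since from_unixtime expects seconds.
hive -e "SELECT CAST(from_unixtime(column_name DIV 1000) AS timestamp) FROM table_name LIMIT 5;"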
