sqoop export from HDFS to Oracle: Error
Problem description
Command used:
sqoop export --connect jdbc:oracle:thin:@//xxx:1521/BDWDEV4 --username xxx --password xxx --table TW5T0 --export-dir '/data/raw/oltp/cogen/oraclexport/TW5T0/2015-08-18' -m 8 --input-fields-terminated-by '\001' --lines-terminated-by '\n' --input-escaped-by '\"' --input-optionally-enclosed-by '\"'
The destination table's columns have datatype DATE in Oracle, but as shown in the error, Sqoop is parsing a plain date as a timestamp.
Error:
15/09/11 06:07:12 INFO mapreduce.Job: map 0% reduce 0%
15/09/11 06:07:17 INFO mapreduce.Job: Task Id : attempt_1438142065989_99811_m_000000_0, Status : FAILED
Error: java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.RuntimeException: Can't parse input data: '2015-08-15'
at TZ401.__loadFromFields(TZ401.java:792)
at TZ401.parse(TZ401.java:645)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
at java.sql.Timestamp.valueOf(Timestamp.java:202)
at TZ401.__loadFromFields(TZ401.java:709)
... 12 more
Instead of changing your data files in Hadoop, you should use the --map-column-java argument in your sqoop export.
If you have, for example, two DATE columns named DATE_COLUMN_1 and DATE_COLUMN_2 in your Oracle table, then you can add the following argument to your sqoop command:
--map-column-java DATE_COLUMN_1=java.sql.Date,DATE_COLUMN_2=java.sql.Date
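Combined with the export command from the question, the full invocation would look roughly like this (a sketch: DATE_COLUMN_1 and DATE_COLUMN_2 are placeholders for the actual DATE column names in TW5T0, which the question does not list):

```shell
# Hypothetical full command: the question's original sqoop export with the
# --map-column-java mapping appended (substitute your real DATE column names).
sqoop export \
  --connect jdbc:oracle:thin:@//xxx:1521/BDWDEV4 \
  --username xxx --password xxx \
  --table TW5T0 \
  --export-dir '/data/raw/oltp/cogen/oraclexport/TW5T0/2015-08-18' \
  -m 8 \
  --input-fields-terminated-by '\001' \
  --lines-terminated-by '\n' \
  --input-escaped-by '\"' \
  --input-optionally-enclosed-by '\"' \
  --map-column-java DATE_COLUMN_1=java.sql.Date,DATE_COLUMN_2=java.sql.Date
```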
As mentioned before, the JDBC format has to be used in your Hadoop text file, but in this case yyyy-mm-dd will work.
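The difference is easy to see in plain JDK code: java.sql.Timestamp.valueOf rejects a date-only string, while java.sql.Date.valueOf accepts it, which is exactly why mapping the columns to java.sql.Date makes the same input parse. A minimal sketch (the class name is illustrative):

```java
public class DateVsTimestampDemo {
    public static void main(String[] args) {
        String field = "2015-08-15"; // a date-only value, as in the failing input

        // java.sql.Date.valueOf accepts the JDBC date escape format yyyy-mm-dd
        System.out.println(java.sql.Date.valueOf(field)); // prints 2015-08-15

        // java.sql.Timestamp.valueOf requires yyyy-mm-dd hh:mm:ss[.fffffffff];
        // this is the IllegalArgumentException seen in the stack trace above
        try {
            java.sql.Timestamp.valueOf(field);
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```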