sqoop export local csv to MySQL error on mapreduce


Problem description

I was trying to export a local csv file to MySQL table "test":

$ sqoop export -fs local -jt local --connect jdbc:mysql://172.16.21.64:3306/cf_ae07c762_41a9_4b46_af6c_a29ecb050204 --username username --password password --table test --export-dir file:///home/username/test.csv

However, I got a strange error saying mapreduce.tar.gz was not found:

Warning: /usr/hdp/2.5.0.0-1245/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/04/07 14:22:14 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
17/04/07 14:22:14 WARN fs.FileSystem: "local" is a deprecated filesystem name. Use "file:///" instead.
17/04/07 14:22:14 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/04/07 14:22:15 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/04/07 14:22:15 INFO tool.CodeGenTool: Beginning code generation
17/04/07 14:22:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test2` AS t LIMIT 1
17/04/07 14:22:15 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test2` AS t LIMIT 1
17/04/07 14:22:15 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.5.0.0-1245/hadoop-mapreduce
Note: /tmp/sqoop-bedrock/compile/009603476b0dfc767b1b94c0607bf6fa/test2.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/04/07 14:22:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-bedrock/compile/009603476b0dfc767b1b94c0607bf6fa/test2.jar
17/04/07 14:22:17 INFO mapreduce.ExportJobBase: Beginning export of test2
17/04/07 14:22:17 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
17/04/07 14:22:17 ERROR tool.ExportTool: Encountered IOException running export job: java.io.FileNotFoundException: File file:/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz does not exist

The file is, however, available on my local machine:

/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz

/data/hadoop/yarn/local/filecache/13/mapreduce.tar.gz

Does anyone know what the issue is? I was just following this guide:

http://ingest.tips/2015/02/06/use-sqoop-transfer-csv-data-local-filesystem-relational-database/

Solution

The property mapreduce.application.framework.path is set to /hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz in mapred-site.xml. This is the path to the MapReduce framework archive, and it points to a file in HDFS.
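
For reference, the setting typically looks something like this in mapred-site.xml (a sketch based on the value above; the exact value on your cluster may differ):

<property>
  <name>mapreduce.application.framework.path</name>
  <!-- Resolved against the job's default filesystem (HDFS), not the local disk -->
  <value>/hdp/apps/2.5.0.0-1245/mapreduce/mapreduce.tar.gz</value>
</property>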

Here, since Sqoop is triggered with -fs local, this property needs to point to a LocalFS path. Try overriding this property value with the local path of the MapReduce archive file:

$ sqoop export -fs local -jt local -D 'mapreduce.application.framework.path=/usr/hdp/2.5.0.0-1245/hadoop/mapreduce.tar.gz' --connect jdbc:mysql://172.16.21.64:3306/cf_ae07c762_41a9_4b46_af6c_a29ecb050204 --username username --password password --table test --export-dir file:///home/username/test.csv
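
Note that -fs, -jt, and -D are generic Hadoop arguments and must come before the Sqoop-specific options, as in the command above. To double-check what your cluster configuration currently sets, you can grep the config file directly (a sketch; /etc/hadoop/conf is the usual HDP config directory, but yours may differ):

$ grep -A 2 'mapreduce.application.framework.path' /etc/hadoop/conf/mapred-site.xml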
