ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly


Problem description


I am trying to import data from MySQL into Hive using Sqoop.

MySQL

use sample;

create table forhive(
    id int auto_increment,
    firstname varchar(36),
    lastname varchar(36),
    primary key(id)
);

insert into  forhive(firstname, lastname) values("sample","singh");

select * from forhive;

1 abhay agrawal

2 vijay sharma

3 sample singh

This is the Sqoop command I'm using (version 1.4.7)

sqoop import --connect jdbc:mysql://********:3306/sample \
    --table forhive --split-by id --columns id,firstname,lastname \
    --target-dir /home/programmeur_v/forhive \
    --hive-import --create-hive-table --hive-table sqp.forhive --username vaibhav -P

This is the error I'm getting:

Error Log

18/08/02 19:19:49 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7

Enter password:

18/08/02 19:19:55 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override

18/08/02 19:19:55 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.

18/08/02 19:19:55 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.

18/08/02 19:19:55 INFO tool.CodeGenTool: Beginning code generation

18/08/02 19:19:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM forhive AS t LIMIT 1

18/08/02 19:19:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM forhive AS t LIMIT 1

18/08/02 19:19:56 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/programmeur_v/softwares/hadoop-2.9.1

Note: /tmp/sqoop-programmeur_v/compile/e8ffa12496a2e421f80e1fa16e025d28/forhive.java uses or overrides a deprecated API.

Note: Recompile with -Xlint:deprecation for details.

18/08/02 19:19:58 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-programmeur_v/compile/e8ffa12496a2e421f80e1fa16e025d28/forhive.jar

18/08/02 19:19:58 WARN manager.MySQLManager: It looks like you are importing from mysql.

18/08/02 19:19:58 WARN manager.MySQLManager: This transfer can be faster! Use the --direct

18/08/02 19:19:58 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.

18/08/02 19:19:58 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)

18/08/02 19:19:58 INFO mapreduce.ImportJobBase: Beginning import of forhive

18/08/02 19:19:58 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar

18/08/02 19:19:59 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps

18/08/02 19:19:59 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032

18/08/02 19:20:02 INFO db.DBInputFormat: Using read commited transaction isolation

18/08/02 19:20:02 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(id), MAX(id) FROM forhive

18/08/02 19:20:02 INFO db.IntegerSplitter: Split size: 0; Num splits: 4 from: 1 to: 3

18/08/02 19:20:02 INFO mapreduce.JobSubmitter: number of splits:3

18/08/02 19:20:02 INFO Configuration.deprecation: yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. Instead, use yarn.system-metrics-publisher.enabled

18/08/02 19:20:02 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1533231535061_0006

18/08/02 19:20:03 INFO impl.YarnClientImpl: Submitted application application_1533231535061_0006

18/08/02 19:20:03 INFO mapreduce.Job: The url to track the job: http://instance-1:8088/proxy/application_1533231535061_0006/

18/08/02 19:20:03 INFO mapreduce.Job: Running job: job_1533231535061_0006

18/08/02 19:20:11 INFO mapreduce.Job: Job job_1533231535061_0006 running in uber mode : false

18/08/02 19:20:11 INFO mapreduce.Job: map 0% reduce 0%

18/08/02 19:20:21 INFO mapreduce.Job: map 33% reduce 0%

18/08/02 19:20:24 INFO mapreduce.Job: map 100% reduce 0%

18/08/02 19:20:25 INFO mapreduce.Job: Job job_1533231535061_0006 completed successfully

18/08/02 19:20:25 INFO mapreduce.Job: Counters: 31

        File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=622830
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=295
        HDFS: Number of bytes written=48
        HDFS: Number of read operations=12
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=6
        Job Counters 
        Killed map tasks=1
        Launched map tasks=3
        Other local map tasks=3
        Total time spent by all maps in occupied slots (ms)=27404
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=27404
        Total vcore-milliseconds taken by all map tasks=27404
        Total megabyte-milliseconds taken by all map tasks=28061696
        Map-Reduce Framework
        Map input records=3
        Map output records=3
        Input split bytes=295
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=671
        CPU time spent (ms)=4210
        Physical memory (bytes) snapshot=616452096
        Virtual memory (bytes) snapshot=5963145216
        Total committed heap usage (bytes)=350224384
        File Input Format Counters 
        Bytes Read=0
        File Output Format Counters 
        Bytes Written=48

18/08/02 19:20:25 INFO mapreduce.ImportJobBase: Transferred 48 bytes in 25.828 seconds (1.8584 bytes/sec)

18/08/02 19:20:25 INFO mapreduce.ImportJobBase: Retrieved 3 records.

18/08/02 19:20:25 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table forhive

18/08/02 19:20:25 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM forhive AS t LIMIT 1

18/08/02 19:20:25 INFO hive.HiveImport: Loading uploaded data into Hive

18/08/02 19:20:25 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

18/08/02 19:20:25 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
        at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
        ... 12 more

After googling the same error, I also added HIVE_CONF_DIR to my .bashrc:

export HIVE_HOME=/home/programmeur_v/softwares/apache-hive-1.2.2-bin

export HIVE_CONF_DIR=/home/programmeur_v/softwares/apache-hive-1.2.2-bin/conf

export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SQOOP_HOME/bin:$HIVE_CONF_DIR

All my Hadoop services are also up and running (jps output):

6976 NameNode

7286 SecondaryNameNode

7559 NodeManager

7448 ResourceManager

8522 DataNode

14587 Jps

I just can't figure out what mistake I'm making here. Please advise!

Solution

You need to download the hive-common-0.10.0.jar file and copy it to the $SQOOP_HOME/lib folder. Sqoop's Hive import fails with this ClassNotFoundException because org.apache.hadoop.hive.conf.HiveConf (which lives in the hive-common jar) is not on Sqoop's classpath.
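Since the question is already running Hive 1.2.2, the matching hive-common jar from the local Hive installation should satisfy the missing class just as well as a downloaded one. A minimal sketch, assuming the install paths from the question (the Sqoop path is hypothetical; adjust both to your setup):

```shell
# Put HiveConf on Sqoop's classpath by copying the hive-common jar
# from the local Hive installation into Sqoop's lib directory.
HIVE_HOME=/home/programmeur_v/softwares/apache-hive-1.2.2-bin
SQOOP_HOME=/home/programmeur_v/softwares/sqoop-1.4.7   # hypothetical path

cp "$HIVE_HOME"/lib/hive-common-*.jar "$SQOOP_HOME"/lib/
```

On some setups, adding Hive's lib directory to HADOOP_CLASSPATH (for example, `export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*` in .bashrc) is reported to resolve the same error without copying jars.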
