hive.HiveImport: FAILED: SemanticException [Error 10072]: Database does not exist:


Question


I am trying to import a MySQL database into Hive to analyze large amounts of MySQL data. According to a blog post, there are a couple of ways to do this:

  1. Non-realtime: Sqoop
  2. Realtime: Hadoop Applier for MySQL

so I decided to go with the non-realtime approach. I have set up a 4-node Hadoop cluster, and Sqoop and Hive are working fine with the following versions:

Name            Version
Apache Hadoop   2.6.0
Apache Hive     hive-0.14.0
Apache Sqoop    sqoop-1.4.5.bin__hadoop-2.0.4-alpha

Now, when I try to import data using the following command,

Import Command

sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-database edgeowt.db  --hive-overwrite

I get the following error:

Error

INFO hive.HiveImport: FAILED: SemanticException [Error 10072]: Database does not exist: edgeowt.db
15/04/16 13:32:09 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 88

I logged in via HiveServer2 and checked the databases, and I am able to see the given database:

$HIVE_HOME/bin>beeline
beeline> !connect jdbc:hive2://localhost:10000 scott tiger org.apache.hive.jdbc.HiveDriver
0: jdbc:hive2://localhost:10000> show databases;
+----------------+--+
| database_name  |
+----------------+--+
| default        |
| edgeowt        |
+----------------+--+
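Note that Hive lists the database as `edgeowt`, not `edgeowt.db`: the `.db` suffix belongs only to the backing warehouse directory in HDFS (e.g. `/user/hive/warehouse/edgeowt.db`), never to the database name itself. As a minimal sketch (not the poster's eventual fix, which is described in the solution below), stripping that suffix before building the `--hive-database` argument would look like:

```shell
# Hypothetical helper: Hive database names never carry the ".db" suffix;
# that suffix only appears on the warehouse directory in HDFS. Strip it
# before passing the name to Sqoop's --hive-database flag.
HIVE_DB="edgeowt.db"
HIVE_DB="${HIVE_DB%.db}"          # shell suffix removal -> "edgeowt"
echo "--hive-database $HIVE_DB"   # prints: --hive-database edgeowt
```

As the EDIT further down shows, correcting the name alone did not resolve the error in this case, but the flag does expect the bare database name.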

After looking into the HDFS filesystem web interface, I realized that the owner of the DB directory is different:

Permission  Owner   Group       Size    Replication Block Size  Name
drwxr-xr-x  hduser  supergroup  0 B 0       0 B     candidate
drwxr-xr-x  scott   supergroup  0 B 0       0 B     edgeowt.db
drwxr-xr-x  scott   supergroup  0 B 0       0 B     hd_temperature

I am trying to import the data as hduser, while the database was created by the scott user. I tried to grant ALL privileges to hduser on edgeowt.db using the following command:

0: jdbc:hive2://localhost:10000>GRANT ALL ON DATABASE edgeowt TO USER hduser;

and checked with:

0: jdbc:hive2://localhost:10000> SHOW GRANT ON DATABASE edgeowt;
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
| database  | table  | partition  | column  | principal_name  | principal_type  | privilege  | grant_option  |   grant_time   | grantor  |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+
| edgeowt   |        |            |         | admin           | ROLE            | ALL        | false         | 1429170366000  | scott    |
| edgeowt   |        |            |         | hduser          | USER            | ALL        | false         | 1429170906000  | scott    |
+-----------+--------+------------+---------+-----------------+-----------------+------------+---------------+----------------+----------+--+

but this did not solve the error. So how can I solve this problem? Any pointers toward a solution would be helpful.

~/.bashrc

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
# Set Hadoop-related environment variables
export HADOOP_INSTALL=/opt/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_HOME=$HADOOP_INSTALL

# Set hive home 
export HIVE_HOME=/opt/hive
export PATH=$PATH:$HIVE_HOME/bin

# Set HCatalog home 
export HCAT_HOME=$HIVE_HOME/hcatalog
export PATH=$PATH:$HCAT_HOME/bin

# Set Sqoop home
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
export SQOOP_HOME=/opt/sqoop
export SQOOP_CONF_DIR="$SQOOP_HOME/conf"
export SQOOP_CLASSPATH="$SQOOP_CONF_DIR"
export PATH=$PATH:$SQOOP_HOME/bin

EDIT

I tried with the following command but still get the same error:

sqoop-import-all-tables --verbose --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --warehouse-dir /user/hive/warehouse --hive-database edgeowt --hive-overwrite

Solution

Finally, I found the answer myself while reading a forum discussion about the same issue here.

The issue was with the Hive metastore configuration. There are three types of Hive metastore deployments:

  1. Embedded metastore (the default deployment mode).
  2. Local metastore.
  3. Remote metastore.

My Hive metastore configuration was the default one. As mentioned in the Cloudera documentation on Configuring the Hive Metastore, I changed the metastore configuration from Embedded (the default) to Remote, and it started working for me.
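The answer does not include the configuration itself, but a remote metastore setup along these lines is typical: on the client side, hive-site.xml points at a standalone metastore service via `hive.metastore.uris`. This is a sketch; the host name is an assumption, and 9083 is the conventional default port:

```xml
<!-- hive-site.xml (client side) - sketch; "metastore-host" is an assumed host name -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
    <description>Point Hive (and Sqoop's Hive import) at the remote
    metastore service instead of an embedded Derby instance.</description>
  </property>
</configuration>
```

The metastore service itself can be started with `hive --service metastore`, and on the server side the `javax.jdo.option.ConnectionURL` and `javax.jdo.option.ConnectionDriverName` properties must point at the backing database; see the Cloudera documentation referenced below for the full set of properties.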

For more information on metastore configuration, see the following Cloudera documentation:

Configuring the Hive Metastore

Sqoop Command

sqoop-import-all-tables --connect jdbc:mysql://X.X.X.X/edgeowt --username root -P --hive-import --hive-database edgeowt --hive-overwrite -m 4
