Installing Hive on Ubuntu (trouble with Derby?)


Question


I've installed Hadoop, Spark, R, Rstudio-server and SparkR, and I'm now trying to install Hive.

Following tutorials on the internet, here's what I did:

$ cd /home/francois-ubuntu/media/
$ mkdir install-hive
$ cd install-hive
$ wget http://mirrors.ircam.fr/pub/apache/hive/hive-2.1.0/apache-hive-2.1.0-bin.tar.gz
$ tar -xzvf apache-hive-2.1.0-bin.tar.gz
$ mkdir /usr/lib/hive
$ mv apache-hive-2.1.0-bin /usr/lib/hive
$ cd
$ rm -rf /home/francois-ubuntu/media/install-hive
$ sudo vim ~/.bashrc

In .bashrc, I wrote the following (I'm also including the lines related to Java, Hadoop and Spark, in case they help):

# Set JAVA_HOME
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

# Set HADOOP_HOME
alias hadoop=/usr/local/hadoop/bin/hadoop
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin

# Set SPARK_HOME
export SPARK_HOME=/usr/local/spark

# Set HIVE_HOME
export HIVE_HOME=/usr/lib/hive/apache-hive-2.1.0-bin
PATH=$PATH:$HIVE_HOME/bin
export PATH

Back to the CLI:

$ cd /usr/lib/hive/apache-hive-2.1.0-bin/bin
$ sudo vim hive-config.sh

In hive-config.sh, I added:

export HADOOP_HOME=/usr/local/hadoop

Then :wq, and back to the CLI:

$ hadoop fs -mkdir /usr/hive/warehouse
$ hadoop fs -chmod g+w /usr/hive/warehouse

And then finally:

$ hive

Here is what I get:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Mon Jul 18 12:13:44 CEST 2016 Thread[main,5,main] java.io.FileNotFoundException: derby.log (Permission denied)
----------------------------------------------------------------
Mon Jul 18 12:13:45 CEST 2016:
Booting Derby (version The Apache Software Foundation - Apache Derby - 10.10.2.0 - (1582446)) instance a816c00e-0155-fd7f-479a-0000040c9aa0 
on database directory /usr/lib/hive/apache-hive-2.1.0-bin/bin/metastore_db in READ ONLY mode with class loader sun.misc.Launcher$AppClassLoader@2e5c649. 
Loaded from file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/derby-10.10.2.0.jar.
java.vendor=Oracle Corporation
java.runtime.version=1.8.0_91-8u91-b14-0ubuntu4~16.04.1-b14
user.dir=/usr/lib/hive/apache-hive-2.1.0-bin/bin
os.name=Linux
os.arch=amd64
os.version=4.4.0-28-generic
derby.system.home=null
Database Class Loader started - derby.database.classpath=''

And then... nothing, it stops there. According to the tutorials, I should have the hive prompt (hive>) at this point, but I don't. I tried some hive commands and they don't work. I don't have the classic CLI prompt either; there is no prompt at all. I can type things, but I can't execute anything. It seems the only thing I can do is stop it with CTRL+C.

Any idea what's wrong?

Thanks.


Edit 1:

Following this advice from @Hawknight and the help given here, I did the following:

sudo addgroup hive
sudo useradd -g hive hive
sudo adduser hive sudo
sudo mkdir /home/hive
sudo chown -R hive:hive /home/hive
sudo chown -R hive:hive /usr/lib/hive/
visudo

Added this line to the sudoers file:

hive ALL=(ALL) NOPASSWD:ALL

And then, back to the CLI:

sudo su hive
hive

I still get the same problem, though.


Edit 2:

Following the advice from here, I now get a different error. The error output is very long, and I feel it might not be useful to copy everything, since the later errors probably originate from the first one, so here is the beginning:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/usr/lib/hive/apache-hive-2.1.0-bin/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Mon Jul 18 18:03:44 CEST 2016 Thread[main,5,main] java.io.FileNotFoundException: derby.log (Permission denied)
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:578)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
    at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:545)
    ... 9 more

Please tell me if you want the rest of the error log.

Solution

The actual SLF4J binding used is Log4j2. For this to work, you need the matching log4j-api and log4j-core dependencies on the classpath. You also need a log4j2.xml configuration in the classpath since the default will only print ERROR messages to the console. The Log4j2 manual has many example configurations.
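As a sketch of that last point, a minimal log4j2.xml placed on the classpath might look like the following (the pattern layout and the INFO level are illustrative choices, not taken from the Hive distribution):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Send log events to the console so startup messages are visible -->
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss} %-5level %logger{36} - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- INFO instead of the default ERROR, so more than just errors reach the console -->
    <Root level="INFO">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```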

You may also want to remove slf4j-log4j12-1.7.10.jar from your classpath.
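A hedged sketch of doing that removal, using the jar path reported in the SLF4J warning above (moving the jar aside rather than deleting it, so the change is reversible):

```shell
# Move the conflicting SLF4J binding out of the Hadoop classpath.
# The path comes from the "Found binding" line in the SLF4J warning.
JAR=/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar
if [ -f "$JAR" ]; then
  # Keep a backup instead of deleting, in case Hadoop tooling needs it back
  sudo mv "$JAR" "$JAR.bak"
  echo "moved $JAR aside"
else
  echo "nothing to do: $JAR not found"
fi
```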
