Connect to kerberised hive using jdbc from remote windows system


Problem description


I have set up a Hive environment with Kerberos security enabled on a Linux server (Red Hat), and I need to connect to Hive from a remote Windows machine using JDBC.

So, I have hiveserver2 running on the Linux machine, and I have done "kinit".

Now I try to connect from the Windows side with a Java test program like this:

Class.forName("org.apache.hive.jdbc.HiveDriver");
String url = "jdbc:hive2://<host>:10000/default;principal=hive/_HOST@<YOUR-REALM.COM>";
Connection con = DriverManager.getConnection(url);

And I get the following error:

Exception due to: Could not open client transport with JDBC Uri:
 jdbc:hive2://<host>:10000/;principal=hive/_HOST@YOUR-REALM.COM>: 
GSS initiate failed

What am I doing wrong here? I checked many forums, but couldn't find a proper solution. Any answer will be appreciated.

Thanks

Solution

If you were running your code on Linux, I would simply point to that post -- i.e. you must use System properties to define the Kerberos and JAAS configuration, from conf files with specific formats.
And you have to switch on the debug trace flags to understand subtle configuration issues (i.e. different flavors/versions of JVMs may have different syntax requirements, which are not documented; it's a trial-and-error process).
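
For reference, a minimal sketch of what that looks like on the Java side. The file paths and the "Client" JAAS entry name are placeholders, not values from your environment:

// Point the JVM at the Kerberos and JAAS configuration files.
// Paths are placeholders -- adapt them to wherever your files live.
System.setProperty("java.security.krb5.conf", "C:/kerberos/krb5.conf");
System.setProperty("java.security.auth.login.config", "C:/kerberos/jaas.conf");
System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

// Debug trace flags -- invaluable when "GSS initiate failed" tells you nothing.
System.setProperty("sun.security.krb5.debug", "true");
System.setProperty("sun.security.jgss.debug", "true");

// A typical jaas.conf entry (plain text file, not Java):
//   Client {
//     com.sun.security.auth.module.Krb5LoginModule required
//     useTicketCache=true
//     doNotPrompt=true;
//   };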

But on Windows there are additional problems:

  1. the Apache Hive JDBC driver has some dependencies on Hadoop JARs, especially when Kerberos is involved (see that post for details)
  2. these Hadoop JARs require "native libraries" -- i.e. a Windows port of Hadoop (which you have to compile yourself!! or download from an insecure source on the web!!) -- plus System properties hadoop.home.dir and java.library.path pointing to the Hadoop home dir and its bin sub-dir respectively (see the sketch after this list)
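
If you do stay on the Apache driver, those two properties typically end up like this. C:/hadoop is only an assumption -- it is wherever you unpacked or built the Hadoop Windows port:

// hadoop.home.dir can be set programmatically, as long as it happens
// before the first Hadoop class is loaded; it must contain bin/winutils.exe.
System.setProperty("hadoop.home.dir", "C:/hadoop");

// java.library.path is read only once, at JVM startup, so pass it on the
// command line instead:  -Djava.library.path=C:/hadoop/bin  (hadoop.dll lives there)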

On top of that, the Apache Hive driver has compatibility issues -- whenever there are changes in the wire protocol, newer clients cannot connect to older servers.

So I strongly advise you to use the Cloudera JDBC driver for Hive for your Windows clients. The Cloudera site just asks for your e-mail.
After that you have an 80+ page PDF manual to read, the JARs to add to your CLASSPATH, and your JDBC URL to adapt according to the manual.
Side note: the Cloudera driver is a proper JDBC-4.x compliant driver, no need for that legacy Class.forName()...
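
To illustrate, a Kerberos connection with the Cloudera driver ends up looking something like the sketch below. The property names (AuthMech, KrbRealm, KrbHostFQDN, KrbServiceName) come from the Cloudera manual -- double-check them against the version of the driver you actually download; <host> placeholders are the same as in the question:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// No Class.forName() needed: the JDBC-4.x driver registers itself
// automatically once its JARs are on the CLASSPATH.
String url = "jdbc:hive2://<host>:10000/default;"
        + "AuthMech=1;"              // 1 = Kerberos authentication
        + "KrbRealm=YOUR-REALM.COM;"
        + "KrbHostFQDN=<host>;"      // FQDN used in the hive/_HOST principal
        + "KrbServiceName=hive";

try (Connection con = DriverManager.getConnection(url);
     Statement stmt = con.createStatement();
     ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
    while (rs.next()) {
        System.out.println(rs.getString(1));
    }
}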
