How to connect Spark SQL to a remote Hive metastore (via thrift protocol) with no hive-site.xml?
Question
I'm using HiveContext with Spark SQL and I'm trying to connect to a remote Hive metastore. So far, the only way I've found to point Spark at the metastore is to include hive-site.xml on the classpath (or copy it to /etc/spark/conf/).
Is there a way to set this parameter programmatically in Java code, without including hive-site.xml? If so, which Spark configuration should be used?
Answer
For Spark 1.x, you can set it with:
System.setProperty("hive.metastore.uris", "thrift://METASTORE:9083");
final SparkConf conf = new SparkConf();
SparkContext sc = new SparkContext(conf);
HiveContext hiveContext = new HiveContext(sc);
or
final SparkConf conf = new SparkConf();
SparkContext sc = new SparkContext(conf);
HiveContext hiveContext = new HiveContext(sc);
hiveContext.setConf("hive.metastore.uris", "thrift://METASTORE:9083");
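Whichever variant you use, the key point is that hive.metastore.uris must be in place before the HiveContext initializes its Hive client. A minimal, Spark-free sketch of the system-property route (the helper name and the METASTORE host are placeholders, not part of the original answer):

```java
// Sketch: build and register the metastore URI before creating the HiveContext.
// Property name "hive.metastore.uris" is from the answer; host/port are placeholders.
public class MetastoreConfig {
    public static String metastoreUri(String host, int port) {
        return "thrift://" + host + ":" + port;
    }

    public static void main(String[] args) {
        System.setProperty("hive.metastore.uris", metastoreUri("METASTORE", 9083));
        System.out.println(System.getProperty("hive.metastore.uris"));
        // → thrift://METASTORE:9083
        // At this point, new HiveContext(new SparkContext(new SparkConf()))
        // would pick the URI up from the system properties.
    }
}
```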
Update: if your Hive is kerberized, try setting these before creating the HiveContext:
System.setProperty("hive.metastore.sasl.enabled", "true");
System.setProperty("hive.security.authorization.enabled", "false");
System.setProperty("hive.metastore.kerberos.principal", hivePrincipal);
System.setProperty("hive.metastore.execute.setugi", "true");
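The Kerberos-related settings above can be grouped into one helper and applied in a single step before the HiveContext is created; a minimal sketch, assuming the property names from the answer (the class name and principal value are placeholders):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: collect the kerberized-metastore settings in one place and apply
// them as system properties before new HiveContext(sc) is called.
public class KerberizedMetastore {
    public static Map<String, String> kerberosProps(String hivePrincipal) {
        Map<String, String> props = new LinkedHashMap<>();
        props.put("hive.metastore.uris", "thrift://METASTORE:9083");
        props.put("hive.metastore.sasl.enabled", "true");
        props.put("hive.security.authorization.enabled", "false");
        props.put("hive.metastore.kerberos.principal", hivePrincipal);
        props.put("hive.metastore.execute.setugi", "true");
        return props;
    }

    public static void apply(Map<String, String> props) {
        props.forEach(System::setProperty); // must run before the HiveContext is built
    }

    public static void main(String[] args) {
        apply(kerberosProps("hive/_HOST@EXAMPLE.COM"));
        System.out.println(System.getProperty("hive.metastore.sasl.enabled"));
        // → true
    }
}
```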