How to enable or disable Hive support in spark-shell through Spark property (Spark 1.6)?
Problem description
Is there any configuration property we can set to explicitly disable or enable Hive support through spark-shell in Spark 1.6? I tried to get all the sqlContext configuration properties with
sqlContext.getAllConfs.foreach(println)
but I am not sure which property is actually required to disable or enable Hive support. Or is there any other way to do this?
Recommended answer
Spark >= 2.0

Enabling and disabling the Hive context is possible with the config spark.sql.catalogImplementation.

Possible values for spark.sql.catalogImplementation are in-memory or hive.
SPARK-16013 added an option to disable HiveContext in spark-shell/pyspark.
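In practice (a minimal sketch, assuming the standard Spark 2.x shell and config API), the setting can be passed when launching spark-shell and then checked from inside the session:

spark-shell --conf spark.sql.catalogImplementation=in-memory

// inside the shell: shows which catalog implementation is active ("in-memory" or "hive")
spark.conf.get("spark.sql.catalogImplementation")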
Spark < 2.0

Such a Spark property is not available in Spark 1.6.
One way to work around it is to remove the Hive-related jars, which in turn disables Hive support in Spark (since Spark has Hive support only when the required Hive classes are available).
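To confirm whether the Hive classes are actually reachable in a given Spark 1.6 shell, a small check like the following can be run (a sketch only; org.apache.spark.sql.hive.HiveContext is the class the 1.x shell roughly relies on when deciding whether sqlContext can be a HiveContext):

// prints whether the Hive classes can be loaded from the current classpath
try {
  Class.forName("org.apache.spark.sql.hive.HiveContext")
  println("Hive classes found: Hive support is available")
} catch {
  case _: ClassNotFoundException => println("Hive classes missing: plain SQLContext only")
}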