Spark SQL can't find table in Hive in HDP


Problem description

I use HDP 3.1 and added Spark2, Hive, and the other services that are needed. I turned on the ACID feature in Hive. The Spark job can't find the table in Hive, but the table exists in Hive. The exception looks like:

org.apache.spark.sql.AnalysisException: Table or view not found

There is a hive-site.xml in Spark's conf folder. It is created automatically by HDP, but it isn't the same as the file in Hive's conf folder. And from the log, Spark gets the Thrift URI of Hive correctly.

I use Spark SQL and created one Hive table in spark-shell. I found the table was created in the folder specified by spark.sql.warehouse.dir. I changed its value to the value of hive.metastore.warehouse.dir, but the problem is still there. I also enabled Hive support when creating the Spark session:

val ss = SparkSession.builder().appName("统计").enableHiveSupport().getOrCreate()
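For reference, a minimal sketch of the warehouse-dir change described above; the path shown is only an assumed HDP 3.x default, and the real value has to come from hive.metastore.warehouse.dir in the cluster's hive-site.xml:

import org.apache.spark.sql.SparkSession

// Sketch of the configuration attempt described above. The warehouse path is
// an assumed HDP 3.x default; use the value of hive.metastore.warehouse.dir
// from the cluster's hive-site.xml.
val ss = SparkSession.builder()
  .appName("统计")
  .config("spark.sql.warehouse.dir", "/warehouse/tablespace/managed/hive")
  .enableHiveSupport()
  .getOrCreate()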

Recommended answer

You can use the Hive Warehouse Connector and enable LLAP in the Hive config. In HDP 3.x, Spark and Hive use separate catalogs, and Spark cannot read Hive's managed (ACID) tables directly; access to them goes through the Hive Warehouse Connector.
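For illustration, a minimal sketch of reading a Hive table through the Hive Warehouse Connector; the class and method names follow the HDP 3.x HWC API, while the JDBC URL, hosts, and table name are cluster-specific placeholders:

import com.hortonworks.hwc.HiveWarehouseSession

// Assumes the HWC assembly jar is on the classpath and that spark-submit was
// given the cluster-specific settings (placeholder values shown):
//   --conf spark.sql.hive.hiveserver2.jdbc.url=jdbc:hive2://hs2-host:10000
//   --conf spark.datasource.hive.warehouse.metastoreUri=thrift://ms-host:9083
//   --conf spark.hadoop.hive.llap.daemon.service.hosts=@llap0
//   --conf spark.hadoop.hive.zookeeper.quorum=zk-host:2181
val hive = HiveWarehouseSession.session(ss).build()

// The query runs through HiveServer2/LLAP, so Hive's managed (ACID) tables
// are visible, unlike with Spark's own catalog.
hive.executeQuery("SELECT * FROM my_table").show()

With this path the table lookup happens in Hive's catalog rather than Spark's, which is why the AnalysisException above goes away.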
