Spark 2.1 - Error While instantiating HiveSessionState


Question

With a fresh install of Spark 2.1, I am getting an error when executing the pyspark command.

Traceback (most recent call last):
File "/usr/local/spark/python/pyspark/shell.py", line 43, in <module>
spark = SparkSession.builder\
File "/usr/local/spark/python/pyspark/sql/session.py", line 179, in getOrCreate
session._jsparkSession.sessionState().conf().setConfString(key, value)
File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
File "/usr/local/spark/python/pyspark/sql/utils.py", line 79, in deco
raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':"

I have Hadoop and Hive on the same machine. Hive is configured to use MySQL for the metastore. I did not get this error with Spark 2.0.2.

Can someone please point me in the right direction?

Answer

I had the same problem. Some of the suggested answers, such as running sudo chmod -R 777 /tmp/hive/ or downgrading Spark's Hadoop dependency to 2.6, didn't work for me. I realized that what caused this problem in my case was that I was issuing SQL queries through the old sqlContext instead of the sparkSession.

sparkSession = SparkSession.builder.master("local[*]") \
    .appName("appName") \
    .config("spark.sql.warehouse.dir", "./spark-warehouse") \
    .getOrCreate()
sqlCtx.registerDataFrameAsTable(..)
df = sparkSession.sql("SELECT ...")

This works perfectly for me now.

