How to fix Exception while running locally spark-sql program on windows10 by enabling HiveSupport?
Question
I am working with Spark SQL 2.3.1 and I am trying to enable Hive support while creating a session, as below:
.enableHiveSupport()
.config("spark.sql.warehouse.dir", "c://tmp//hive")
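For context, the two lines above sit inside a full session builder. A minimal sketch of what that looks like (the app name and local master are illustrative assumptions, not from the original question):

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch of a local session with Hive support enabled.
// "HiveSupportExample" and local[*] are assumptions for illustration.
val spark = SparkSession.builder()
  .appName("HiveSupportExample")
  .master("local[*]")
  .enableHiveSupport()
  .config("spark.sql.warehouse.dir", "c://tmp//hive")
  .getOrCreate()
```

With `.enableHiveSupport()`, Hive's `SessionState` is initialized at session startup, which is where the scratch-directory permission check below is triggered.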
I ran the following command:
C:\Software\hadoop\hadoop-2.7.1\bin>winutils.exe chmod 777 C:\tmp\hive
While running my program I get:
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw- at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
How can I fix this issue and run the program on my local Windows machine?
Answer
Try using this command:
hadoop fs -chmod -R 777 /tmp/hive/
This is a Spark exception, not a Windows one. You need to set the correct permissions on the HDFS folder, not only on your local directory.
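On a plain local Windows setup without an HDFS cluster, the same fix is usually applied through winutils.exe against the local \tmp\hive directory, run from the drive Spark resolves the path on. A sketch, assuming the Hadoop layout from the question:

```
C:\Software\hadoop\hadoop-2.7.1\bin>winutils.exe chmod -R 777 \tmp\hive
C:\Software\hadoop\hadoop-2.7.1\bin>winutils.exe ls \tmp\hive
```

After the chmod, `winutils.exe ls` should report drwxrwxrwx on the directory; if it still shows rw-rw-rw-, Hive's SessionState check will keep failing with the same exception.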