Will a query from Spark HiveContext lock the Hive table?
Question
I know that if I submit a query from Hive, a shared lock is acquired and the Hive table gets locked by that query: https://cwiki.apache.org/confluence/display/Hive/Locking
So I wonder: if the query is executed through Spark HiveContext, is a lock acquired and does the table get locked as well? Also, if I insert data into a table through Spark HiveContext, does it require an exclusive lock?
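For context, the lock model described in the Hive wiki page linked above can be exercised directly from the Hive CLI once concurrency support is enabled (`hive.support.concurrency=true`). The statements below are standard HiveQL; the table name `my_table` is a hypothetical placeholder:

```sql
-- Hive acquires a SHARED lock implicitly for reads; these make it explicit.
LOCK TABLE my_table SHARED;      -- concurrent readers are still allowed
SHOW LOCKS my_table;             -- inspect the current locks on the table
UNLOCK TABLE my_table;

-- Writes (e.g. INSERT OVERWRITE) take an EXCLUSIVE lock, which blocks
-- both readers and other writers until it is released:
LOCK TABLE my_table EXCLUSIVE;
UNLOCK TABLE my_table;
```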
Thanks
Answer
It was supported in Spark SQL 1.6, but it is not supported in the 2.x and 3.x versions. Spark's SQL grammar explicitly lists the Hive lock commands as unsupported:
unsupportedHiveNativeCommands
...
| kw1=LOCK kw2=TABLE
| kw1=LOCK kw2=DATABASE
| kw1=UNLOCK kw2=TABLE
| kw1=UNLOCK kw2=DATABASE
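To make the grammar excerpt concrete: on Spark 2.x/3.x, submitting any of these statements through `spark.sql(...)` is rejected at parse time, and ordinary queries run by Spark bypass Hive's lock manager rather than acquiring shared locks. A hedged sketch (the table name `my_table` is hypothetical):

```sql
-- Submitted via spark.sql(...) on Spark 2.x/3.x, each of these fails
-- at parse time with a ParseException ("Operation not allowed"):
LOCK TABLE my_table SHARED;
UNLOCK TABLE my_table;

-- Plain reads and writes succeed, but Spark does not go through
-- Hive's lock manager when executing them:
SELECT * FROM my_table;           -- no shared lock registered in Hive
INSERT INTO my_table VALUES (1);  -- no exclusive lock registered either
```

The practical consequence is that Spark jobs neither honor nor create Hive locks, so concurrent Hive and Spark writers must be coordinated by other means.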