Remove Temporary Tables from Apache Spark SQL
Question
I am using Zeppelin with Apache Spark, and I register a temp table as shown below:
val hvacText = sc.textFile("...")

case class Hvac(date: String, time: String, targettemp: Integer,
                actualtemp: Integer, buildingID: String)

val hvac = hvacText
  .map(s => s.split(","))
  .filter(s => s(0) != "Date")
  .map(s => Hvac(s(0), s(1), s(2).toInt, s(3).toInt, s(6)))
  .toDF()

hvac.registerTempTable("hvac")
After I am done with my queries against this temp table, how do I remove it?

I checked all the docs and it seems I am getting nowhere.

Any guidance?
Answer
Spark 2.x
For temporary views you can use Catalog.dropTempView:
spark.catalog.dropTempView("df")
For global views you can use Catalog.dropGlobalTempView:
spark.catalog.dropGlobalTempView("df")
Both methods are safe to call if the view doesn't exist and, since Spark 2.1, return a boolean indicating whether the operation succeeded.
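As a minimal sketch of the Spark 2.x lifecycle (assuming a SparkSession named `spark` is already available, as in a Zeppelin or spark-shell session, and that a small inline dataset stands in for the real data):

```scala
import spark.implicits._

// Hypothetical toy data standing in for the HVAC file.
val df = Seq(("2013-06-01", 66)).toDF("date", "targettemp")

// Spark 2.x replacement for the deprecated registerTempTable.
df.createOrReplaceTempView("df")

spark.sql("SELECT * FROM df").show()

// Since Spark 2.1 this returns true when the view existed and was dropped.
val dropped = spark.catalog.dropTempView("df")

// Calling it again is safe; it simply returns false.
val droppedAgain = spark.catalog.dropTempView("df")
```

Global temp views work the same way through `spark.catalog.dropGlobalTempView`, but note that while they exist they are queried via the `global_temp` database (e.g. `SELECT * FROM global_temp.df`).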
Spark 1.x
You can use SQLContext.dropTempTable:
scala.util.Try(sqlContext.dropTempTable("df"))
It can still be used in Spark 2.0, but it delegates processing to Catalog.dropTempView and is safe to use even if the table doesn't exist.
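A short sketch of the Spark 1.x pattern, assuming a SQLContext named `sqlContext` is in scope; the `Try` wrapper guards against the call throwing when the table is missing, which is why the answer above wraps it:

```scala
import scala.util.Try

// In Spark 1.x, wrapping in Try swallows the error raised
// when the temp table does not exist.
Try(sqlContext.dropTempTable("df"))

// From Spark 2.0 on, this delegates to Catalog.dropTempView and
// no longer fails for a missing table, so the Try becomes optional:
sqlContext.dropTempTable("df")
```

In new code targeting Spark 2.x, prefer `spark.catalog.dropTempView` directly; the SQLContext method is kept mainly for backward compatibility.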