What happens if SparkSession is not closed?


Question

What is the difference between the following two?

import org.apache.spark.sql.SparkSession

object Example1 {
    def main(args: Array[String]): Unit = {
        // Declare spark outside the try block so it is in scope in finally
        val spark = SparkSession.builder.getOrCreate()
        try {
            // spark code here
        } finally {
            spark.close()
        }
    }
}

import org.apache.spark.sql.SparkSession

object Example2 {
    val spark = SparkSession.builder.getOrCreate()
    def main(args: Array[String]): Unit = {
        // spark code here
    }
}


I know that SparkSession implements Closeable and it hints that it needs to be closed. However, I can't think of any issues if the SparkSession is just created as in Example2 and never closed directly.


In case of success or failure of the Spark application (and exit from main method), the JVM will terminate and the SparkSession will be gone with it. Is this correct?


IMO: The fact that the SparkSession is a singleton should not make a big difference either.

Answer


You should always close your SparkSession when you are done with it (even if only to follow the good practice of giving back what you've been given).
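Since SparkSession implements Closeable (and hence AutoCloseable), on Scala 2.13+ the close can also be guaranteed with scala.util.Using instead of an explicit try/finally. A minimal sketch; the master setting and object name are illustrative assumptions, not part of the original question:

```scala
import org.apache.spark.sql.SparkSession
import scala.util.Using

object Example3 {
  def main(args: Array[String]): Unit = {
    // Using.resource closes the session even if the body throws
    Using.resource(SparkSession.builder.master("local[*]").getOrCreate()) { spark =>
      // spark code here
      println(spark.version)
    }
  }
}
```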


Closing a SparkSession may trigger freeing cluster resources that could be given to some other application.


SparkSession is a session and as such maintains resources that consume JVM memory. You can have as many SparkSessions as you want (see SparkSession.newSession to create a fresh session), but you don't want them holding on to memory they no longer need, so close any session you are done with.
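A minimal sketch of that session lifecycle, assuming a local master for illustration (newSession shares the underlying SparkContext but keeps its own SQL configuration and temporary views):

```scala
import org.apache.spark.sql.SparkSession

object SessionsSketch {
  def main(args: Array[String]): Unit = {
    // The first builder call also starts the shared SparkContext
    val spark = SparkSession.builder
      .master("local[*]")
      .appName("sessions-sketch")
      .getOrCreate()

    // newSession shares the SparkContext but has isolated SQL
    // configuration, temporary views and registered functions
    val isolated = spark.newSession()
    isolated.range(5).createOrReplaceTempView("nums")

    // The temp view is visible only in the session that created it
    assert(isolated.catalog.tableExists("nums"))
    assert(!spark.catalog.tableExists("nums"))

    // Closing the session that owns the SparkContext stops the
    // whole application's context and releases its memory
    spark.close()
  }
}
```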


SparkSession is Spark SQL's wrapper around Spark Core's SparkContext, so under the covers (as in any Spark application) you have cluster resources, i.e. vcores and memory, assigned to your SparkSession (through its SparkContext). That means that as long as your SparkContext is in use (via the SparkSession), those cluster resources won't be assigned to other tasks (not necessarily Spark's; this includes other non-Spark applications submitted to the cluster). These cluster resources are yours until you say "I'm done", which translates to calling close.
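That wrapper relationship can be sketched as follows; the master setting and object name are assumptions for a local run:

```scala
import org.apache.spark.sql.SparkSession

object ContextSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .master("local[2]") // two local vcores stand in for cluster resources
      .appName("context-sketch")
      .getOrCreate()

    // The SparkContext underneath the SparkSession is what actually
    // holds the assigned vcores and memory
    val sc = spark.sparkContext
    println(s"app id: ${sc.applicationId}")
    println(s"default parallelism: ${sc.defaultParallelism}")

    // close() stops the SparkContext, i.e. says "I'm done" to the cluster
    spark.close()
    assert(sc.isStopped)
  }
}
```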


If, however, you simply exit the Spark application, you don't have to think about calling close, since the resources will be released automatically anyway. The JVMs for the driver and the executors terminate, and so does the (heartbeat) connection to the cluster, so eventually the resources are returned to the cluster manager, which can offer them to other applications.

