How to stop SparkContext


Question

How do I stop any Spark context that is currently running?

Information: API: Scala; Spark version: 2.3

I have created Spark contexts. To stop them I should call something like instance.stop(), but I can't remember the instance name of the Spark context. So how can I stop a running Spark context?

Or is there any way to reset everything I have done in spark-shell and start over from scratch?

Answer

This doesn't really answer your question, but it may help prevent the issue in the future.

You can (should?) use the atexit module to ensure that your Spark context's .stop() gets called automatically when you exit Python.

import atexit

from pyspark import SparkConf, SparkContext

# Spark configuration code here
conf = SparkConf().setAppName("example")
sc = SparkContext(conf=conf)

# Stop the context automatically when the Python interpreter exits.
atexit.register(lambda: sc.stop())
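As a side note, PySpark's SparkContext.getOrCreate() returns the already-active context when one exists, so a context whose variable name you have forgotten can be recovered and stopped. The sketch below uses a minimal stub class in place of the real pyspark.SparkContext (so it runs without Spark installed); the getOrCreate()/stop() calls mirror the real API:

```python
# Stub standing in for pyspark.SparkContext, for illustration only.
# The real class exposes the same hook: getOrCreate() hands back the
# currently active context instead of constructing a second one.
class SparkContextStub:
    _active = None  # Spark allows at most one active context per JVM

    def __init__(self):
        if SparkContextStub._active is not None:
            raise ValueError("Only one SparkContext may be running")
        SparkContextStub._active = self

    @classmethod
    def getOrCreate(cls):
        # Return the active context if there is one, else create it.
        return cls._active if cls._active is not None else cls()

    def stop(self):
        SparkContextStub._active = None


# Create a context, then "lose" the variable name.
SparkContextStub()

# Recover and stop it without knowing the original name.
sc = SparkContextStub.getOrCreate()
sc.stop()
print(SparkContextStub._active)  # → None
```

In spark-shell specifically, the context is always pre-bound to the name sc (and the session to spark), so sc.stop() works there; restarting the shell is the simplest way to reset everything.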

