Dask Distributed Getting Futures after Client Closed

Problem Description

Is there any way to prevent dask/distributed from cancelling queued and executing futures when the client is closed?

I want to use a Jupyter notebook to kick off some very long-running simulations with distributed, close the notebook, and retrieve the results some time later.

Answer

You can use the "publish" mechanism to keep references to some data around in the scheduler for later retrieval in another client. Two forms exist which do the same thing with different syntax:

>>> client.publish_dataset(mydata=f)

Here f is a future, a list of futures, or a dask collection (a dataframe, etc.).
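
For example, f might come from submitting your long-running simulations in the first session. A minimal sketch of that session (the scheduler address, run_simulation and params below are placeholders, not part of the original answer):

>>> from dask.distributed import Client
>>> client = Client('tcp://scheduler-host:8786')            # hypothetical scheduler address
>>> f = [client.submit(run_simulation, p) for p in params]  # run_simulation/params are placeholders
>>> client.publish_dataset(mydata=f)                        # scheduler now holds references to the futures
>>> client.close()                                          # the notebook can now be closed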

In another session:

>>> client.list_datasets()
['mydata']
>>> client.get_dataset('mydata')
<same thing as f>

The alternative, and maybe simpler, syntax looks like this:

>>> client.datasets['mydata'] = f

>>> list(client.datasets)
['mydata']
>>> client.datasets['mydata']
<same thing as f>

To remove the references and allow the data to be cleared from the cluster (if no client needs them), use client.unpublish_dataset('mydata') or del client.datasets['mydata'].
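
Putting it together, a later session could reconnect, wait for the results, and then release the reference. A minimal sketch, again assuming the same placeholder scheduler address:

>>> from dask.distributed import Client
>>> client = Client('tcp://scheduler-host:8786')   # reconnect to the same scheduler
>>> f = client.get_dataset('mydata')               # the futures published earlier
>>> results = client.gather(f)                     # blocks until the simulations finish
>>> client.unpublish_dataset('mydata')             # allow the cluster to clear the data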
