multiprocessing: How do I share a dict among multiple processes?

Question

I have a program that creates several processes that work on a joinable queue, Q, and may eventually manipulate a global dictionary D to store results (so each child process can use D to store its result and also see what results the other child processes are producing).
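Roughly, the structure looks like this simplified sketch (the worker logic, the doubling computation, and the names here are illustrative stand-ins, not my real code):

from multiprocessing import Process, JoinableQueue

D = {}  # global dict meant to collect results from all children

def worker(q):
    while True:
        item = q.get()
        if item is None:          # sentinel: no more work for this worker
            q.task_done()
            break
        D[item] = item * 2        # child stores its result in D
        q.task_done()

if __name__ == '__main__':
    Q = JoinableQueue()
    workers = [Process(target=worker, args=(Q,)) for _ in range(2)]
    for p in workers:
        p.start()
    for i in range(4):
        Q.put(i)
    for _ in workers:
        Q.put(None)
    Q.join()      # wait for the queue to drain
    print(D)      # I expected the collected results here, but D is empty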

If I print the dictionary D in a child process, I see the modifications that have been done on it (i.e. on D). But after the main process joins Q, if I print D, it's an empty dict!

I understand it is a synchronization/lock issue. Can someone tell me what is happening here, and how I can synchronize access to D?

Answer

What is happening: each process runs in its own address space, so a plain global dict is copied into (or re-created by) every child, and a child's writes never reach the parent; that is why D prints as empty after the join. A general answer involves using a Manager object, which keeps the real dict in a server process and hands each process a proxy to it. Adapted from the docs:

from multiprocessing import Process, Manager

def f(d):
    # Each worker mutates the shared dict through its proxy.
    d[1] += '1'
    d['2'] += 2

if __name__ == '__main__':
    manager = Manager()

    d = manager.dict()  # the real dict lives in the manager's server process
    d[1] = '1'
    d['2'] = 2

    p1 = Process(target=f, args=(d,))
    p2 = Process(target=f, args=(d,))
    p1.start()
    p2.start()
    p1.join()
    p2.join()

    print(d)

Output:

$ python mul.py 
{1: '111', '2': 6}
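To connect this back to the queue setup in the question: pass the managed dict to each worker along with the joinable queue. The sketch below is one way to wire it up; the worker body and the doubling task are placeholders, not the asker's actual code:

from multiprocessing import Process, JoinableQueue, Manager

def worker(q, d):
    while True:
        item = q.get()
        if item is None:          # sentinel: no more work for this worker
            q.task_done()
            break
        d[item] = item * 2        # write goes through the proxy, visible to all
        q.task_done()

if __name__ == '__main__':
    manager = Manager()
    d = manager.dict()            # shared state lives in the manager's process
    Q = JoinableQueue()

    workers = [Process(target=worker, args=(Q, d)) for _ in range(2)]
    for p in workers:
        p.start()
    for i in range(4):
        Q.put(i)
    for _ in workers:
        Q.put(None)
    Q.join()
    for p in workers:
        p.join()
    print(dict(d))                # e.g. {0: 0, 1: 2, 2: 4, 3: 6}

One caveat: a statement like d[1] += '1' expands into a separate read and write through the proxy, so it is not atomic across processes; if several workers update the same key, guard the read-modify-write with a lock such as manager.Lock().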
