Global variables and Python multiprocessing


Question

Possible duplicate:
Python multiprocessing global variable updates not returned to parent

I am using a computer with many cores, and for performance benefits I should really use more than one. However, I'm confused why these bits of code don't do what I expect:

from multiprocessing import Process

var = list(range(5))

def test_func(i):
    global var
    var[i] += 1

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = Process(target=test_func, args=(i,))
        jobs.append(p)
        p.start()
    for p in jobs:
        p.join()  # wait for the children to finish

    print(var)

And also:

from multiprocessing import Pool

var = list(range(5))

def test_func(i):
    global var
    var[i] += 1

if __name__ == '__main__':
    p = Pool()
    for i in range(5):
        p.apply_async(test_func, [i])
    p.close()
    p.join()  # wait for all submitted tasks to finish

    print(var)

I expect the result to be [1, 2, 3, 4, 5], but the result is [0, 1, 2, 3, 4].

There must be some subtlety I'm missing in using global variables with processes. Is this even the way to go, or should I avoid trying to change a variable in this manner?

Answer

If you are running two separate processes, they won't share the same globals: each child gets its own copy of the parent's memory, so mutating the global in a child never affects the parent. If you want to pass data between processes, look at using send and recv on a Pipe, or one of the shared-state types. Take a look at http://docs.python.org/library/multiprocessing.html#sharing-state-between-processes for an example similar to what you're doing.

