Global variables and Python multiprocessing


Problem Description


Possible duplicate:

Python multiprocessing global variable updates not returned to parent

I am using a computer with many cores, and for performance benefits I should really use more than one. However, I'm confused why these bits of code don't do what I expect:

from multiprocessing import Process

var = list(range(5))   # each child process works on its own copy of this list

def test_func(i):
    global var
    var[i] += 1        # modifies the copy inside the child process only

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = Process(target=test_func, args=(i,))
        jobs.append(p)
        p.start()
    for p in jobs:
        p.join()       # wait for all children before reading var
    print(var)         # still [0, 1, 2, 3, 4]

As well as:

from multiprocessing import Pool

var = list(range(5))

def test_func(i):
    global var
    var[i] += 1        # modifies the worker process's copy only

if __name__ == '__main__':
    p = Pool()
    for i in range(5):
        p.apply_async(test_func, [i])
    p.close()
    p.join()           # wait for all workers before reading var
    print(var)         # still [0, 1, 2, 3, 4]

I expect the result to be [1, 2, 3, 4, 5], but the result is [0, 1, 2, 3, 4].

There must be some subtlety I'm missing in using global variables with processes. Is this even the way to go, or should I avoid trying to change a variable in this manner?

Recommended Answer

If you are running two separate processes, they won't share the same globals. If you want to pass data between the processes, look at using send and recv. Take a look at http://docs.python.org/library/multiprocessing.html#sharing-state-between-processes for an example similar to what you're doing.

