How to access the result of a function called in a multiprocessing process?


Question



I am running this code:

import random
import multiprocessing
import time

def list_append(count, id):
    out_list = []
    for i in range(count):
        out_list.append(random.random())
    return out_list


if __name__ == "__main__":
    t0 = time.clock()
    size = 10000000   # Number of random numbers to add
    procs = 2   # Number of processes to create

    jobs = []
    for i in range(0, procs):
        process = multiprocessing.Process(target=list_append,args=(size, i))                                        
        jobs.append(process)

# Start the processes (i.e. calculate the random number lists)  
    res=[]  
    for j in jobs:
        r= j.start()
        res.append(r)

# Ensure all of the processes have finished
    for j in jobs:
        j.join()

    print "List processing complete."
    print time.clock()-t0,"seconds"

Unfortunately, at the end of it, res = [None,None] although I want it to be filled with the lists I've filled in the function list_append.

Solution

Process.start() returns None, and the return value of the target function is simply discarded in the child process, so you need to use a data structure that can be shared between processes:

def list_append(count, id, res):
    #                      ^^^
    out_list = []
    for i in range(count):
        out_list.append(random.random())
    res[id] = out_list  # <------

if __name__ == "__main__":
    size = 10000000
    procs = 2   
    manager = multiprocessing.Manager()  # <---
    res = manager.dict()                 # <---
    jobs = []
    for i in range(0, procs):
        process = multiprocessing.Process(target=list_append,args=(size, i, res))
        #                                                                   ^^^^
        jobs.append(process)

    for j in jobs:
        r = j.start()

    for j in jobs:
        j.join()

    print "List processing complete."
    # now `res` will contain results
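
As a small usage sketch (the variable all_numbers is illustrative, not part of the original answer), the per-process lists stored in the manager dict can be merged once the joins have returned:

# continuing after the joins; each res[i] is the list built by process i
all_numbers = []
for proc_id in range(procs):
    all_numbers.extend(res[proc_id])
print(len(all_numbers))  # procs * size numbers in total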

As avenet commented, using multiprocessing.Pool will be simpler:

def list_append(args):
    count, id = args
    out_list = []
    for i in range(count):
        out_list.append(random.random())
    return out_list

if __name__ == "__main__":
    size = 10000000
    procs = 2

    pool = multiprocessing.Pool(procs)
    res = pool.map(list_append, [(size, i) for i in range(procs)])
    pool.close()
    pool.join()

    print "List processing complete."
    # print res
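
Another structure that can be shared between processes, if you prefer not to use a Manager, is multiprocessing.Queue. The following is only a rough sketch of the same idea, not from the original answer:

import random
import multiprocessing

def list_append(count, id, queue):
    # build the list in the child and send it back through the queue
    out_list = [random.random() for _ in range(count)]
    queue.put((id, out_list))

if __name__ == "__main__":
    size = 10000000
    procs = 2

    queue = multiprocessing.Queue()
    jobs = [multiprocessing.Process(target=list_append, args=(size, i, queue))
            for i in range(procs)]
    for j in jobs:
        j.start()

    # collect one result per process *before* joining; with payloads this
    # large, joining first can deadlock because each child blocks until its
    # queued data has been consumed
    results = dict(queue.get() for _ in range(procs))

    for j in jobs:
        j.join()

    # results[0] and results[1] now hold the two lists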
