Returning multiple lists from pool.map processes?


Question

Win 7, x64, Python 2.7.12

In the following code I am setting off some pool processes to do a trivial multiplication via the multiprocessing.Pool.map() method. The output data is collected in List_1.

NOTE: this is a stripped down simplification of my actual code. There are multiple lists involved in the real application, all huge.

import multiprocessing
import numpy as np

def createLists(branches):

    firstList = branches[:] * node

    return firstList


def init_process(lNodes):

    global node
    node = lNodes
    print 'Starting', multiprocessing.current_process().name


if __name__ == '__main__':

    mgr = multiprocessing.Manager()
    nodes = mgr.list()
    pool_size = multiprocessing.cpu_count()

    branches = [i for i in range(1, 21)]
    lNodes = 10
    splitBranches = np.array_split(branches, int(len(branches)/pool_size))

    pool = multiprocessing.Pool(processes=pool_size, initializer=init_process, initargs=[lNodes])
    myList_1 = pool.map(createLists, splitBranches)

    pool.close() 
    pool.join()  
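A pool-free sketch of what the call above produces (the sample data is illustrative, not from the original code): each chunk handed to createLists is a NumPy array, because np.array_split returns arrays, so branches[:] * node is element-wise multiplication, and pool.map collects one result per chunk, in order.

```python
import numpy as np

node = 10
chunks = np.array_split(range(1, 5), 2)        # [array([1, 2]), array([3, 4])]

# Same per-chunk work as createLists, done serially instead of in a pool:
myList_1 = [chunk[:] * node for chunk in chunks]

print(myList_1)  # [array([10, 20]), array([30, 40])] -- one result per chunk
```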

I now add an extra calculation to createLists() & try to pass back both lists.

import multiprocessing
import numpy as np

def createLists(branches):

    firstList = branches[:] * node
    secondList = branches[:] * node * 2

    return firstList, secondList


def init_process(lNodes):
    global node
    node = lNodes
    print 'Starting', multiprocessing.current_process().name


if __name__ == '__main__':

    mgr = multiprocessing.Manager()
    nodes = mgr.list()
    pool_size = multiprocessing.cpu_count()

    branches = [i for i in range(1, 21)]
    lNodes = 10
    splitBranches = np.array_split(branches, int(len(branches)/pool_size))

    pool = multiprocessing.Pool(processes=pool_size, initializer=init_process, initargs=[lNodes])
    myList_1, myList_2 = pool.map(createLists, splitBranches)

    pool.close() 
    pool.join() 

This raises the following error & traceback:

Traceback (most recent call last):

  File "<ipython-input-6-ff188034c708>", line 1, in <module>
    runfile('C:/Users/nr16508/Local Documents/Inter Trab Angle/Parallel/scratchpad.py', wdir='C:/Users/nr16508/Local Documents/Inter Trab Angle/Parallel')

  File "C:\Users\nr16508\AppData\Local\Continuum\Anaconda2\lib\site-packages\spyder\utils\site\sitecustomize.py", line 866, in runfile
    execfile(filename, namespace)

  File "C:\Users\nr16508\AppData\Local\Continuum\Anaconda2\lib\site-packages\spyder\utils\site\sitecustomize.py", line 87, in execfile
    exec(compile(scripttext, filename, 'exec'), glob, loc)

  File "C:/Users/nr16508/Local Documents/Inter Trab Angle/Parallel/scratchpad.py", line 36, in <module>
    myList_1, myList_2 = pool.map(createLists, splitBranches)

ValueError: too many values to unpack
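A pool-free sketch of why the unpacking fails (the chunk data below is illustrative): pool.map returns one (firstList, secondList) tuple per input chunk, so the result list has as many elements as there are chunks — here three, not two — and assigning it to two names raises the ValueError.

```python
def create_lists(chunk):                      # stand-in for createLists
    return chunk, [x * 2 for x in chunk]      # returns a pair per chunk

chunks = [[1, 2], [3, 4], [5, 6]]
results = list(map(create_lists, chunks))     # serial map, same shape as pool.map

print(len(results))  # 3 -- unpacking into two names fails with ValueError
```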

When I tried to put both lists into one to pass back, i.e....

return [firstList, secondList]
......
myList = pool.map(createLists, splitBranches)

...the output becomes too jumbled for further processing.

Is there a method of collecting more than one list from pooled processes?

Answer

This question has nothing to do with multiprocessing or thread pools. It is simply about how to unzip a list of tuples, which can be done with the standard zip(*...) idiom.

myList_1, myList_2 = zip(*pool.map(createLists, splitBranches))
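A minimal, pool-free sketch of the idiom (the pairs below are illustrative stand-ins for pool.map's output): zip(*...) transposes a list of (firstList, secondList) tuples into two sequences of per-chunk results.

```python
# Illustrative stand-in for pool.map's return value: one pair per chunk.
pairs = [([1, 2], [2, 4]), ([3, 4], [6, 8])]

# zip(*pairs) transposes the list of pairs into two grouped sequences.
myList_1, myList_2 = zip(*pairs)

print(myList_1)  # ([1, 2], [3, 4])
print(myList_2)  # ([2, 4], [6, 8])
```

Note that each name ends up as a tuple of per-chunk lists; if a single flat list is wanted, something like `list(itertools.chain.from_iterable(myList_1))` will concatenate the chunks.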

