Python 2.6: Process local storage while using multiprocessing.Pool


Question

I'm attempting to build a Python script that has a pool of worker processes (using multiprocessing.Pool) operating across a large set of data.

I want each process to have a unique object that gets reused across multiple task executions within that process.

Pseudocode:

from multiprocessing import Pool

def work(data):
    # connection should be unique per process
    connection.put(data)
    print 'work done with connection:', connection

if __name__ == '__main__':
    pPool = Pool()  # pool of 4 processes
    datas = range(1, 1001)
    for process in pPool:
        # this is the part I'm asking about // how do I really do this?
        process.connection = Connection(conargs)
    for data in datas:
        pPool.apply_async(work, (data,))

Answer

I think something like this should work (not tested):

from multiprocessing import Pool

def init(*args):
    # runs once in each worker process when the pool starts
    global connection
    connection = Connection(*args)

pPool = Pool(initializer=init, initargs=conargs)
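
Below is a minimal, self-contained sketch of that initializer pattern (my own illustration, not part of the original answer). It substitutes a hypothetical DummyConnection class for the real Connection and uses made-up conargs, just to show that each worker process builds its connection exactly once and then reuses it for every task it handles:

from multiprocessing import Pool, current_process

class DummyConnection(object):
    # hypothetical stand-in for the real Connection class
    def __init__(self, *args):
        self.args = args

    def put(self, data):
        print '%s put %r via connection %r' % (current_process().name, data, self.args)

def init(*args):
    # called exactly once in each worker process when the pool starts
    global connection
    connection = DummyConnection(*args)

def work(data):
    # 'connection' is the per-process global created by init() in this worker
    connection.put(data)

if __name__ == '__main__':
    conargs = ('host', 1234)  # hypothetical connection arguments
    pPool = Pool(processes=4, initializer=init, initargs=conargs)
    for data in range(1, 1001):
        pPool.apply_async(work, (data,))
    pPool.close()
    pPool.join()

Running this, each of the four workers reports the same DummyConnection instance for all tasks it handles, which is the "one unique object per process" behaviour the question asks for.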
