Multiprocessing Pool inside Process times out


Question

Whenever I use the following code, the pool result always returns a timeout. Is there something logically incorrect I am doing?

from multiprocessing import Pool, Process, cpu_count

def add(num):
    return num + 1

def add_wrap(num):
    # Runs inside the child Process; refers to the module-level pool
    new_num = ppool.apply_async(add, [num])
    print new_num.get(timeout=3)

ppool = Pool(processes=cpu_count())

test = Process(target=add_wrap, args=(5,)).start()

I'm aware of this bug, and would have thought that it would have been fixed in Python 2.6.4?

Answer

You can't pass Pool objects between processes.

If you try this code, Python will raise an exception: 'NotImplementedError: pool objects cannot be passed between processes or pickled'.

from multiprocessing import Queue, Pool

q = Queue()
ppool = Pool(processes=2)
q.put([ppool])   # putting the Pool on a Queue tries to pickle it, which is not allowed
ppool = q.get()
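The same restriction can be demonstrated directly with the `pickle` module, without involving a `Queue`. The sketch below is not from the original answer; `try_to_pickle` is a hypothetical helper, and the exact wording of the error message may vary by Python version:

```python
import pickle
from multiprocessing import Pool

def try_to_pickle(pool):
    """Return the error message raised when pickling a Pool, or None on success."""
    try:
        pickle.dumps(pool)
        return None
    except NotImplementedError as exc:
        return str(exc)

if __name__ == '__main__':
    pool = Pool(processes=2)
    print(try_to_pickle(pool))   # Pool.__reduce__ raises NotImplementedError on purpose
    pool.close()
    pool.join()
```

This is why `q.put([ppool])` above fails: `Queue.put` pickles its argument behind the scenes, and `Pool` deliberately refuses to be pickled.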

So if you want your code to work, just create the Pool object inside the add_wrap method.

from multiprocessing import Pool, Process, cpu_count

def add(num):
    return num + 1

def add_wrap(num):
    # Create the Pool inside the child Process instead of inheriting it
    ppool = Pool(processes=cpu_count())
    new_num = ppool.apply_async(add, [num])
    print new_num.get(timeout=3)

test = Process(target=add_wrap, args=(5,)).start()
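Not part of the original answer, but if the wrapping `Process` is not actually needed, a simpler design is to create one `Pool` in the parent and submit work from there, avoiding the pickling issue entirely. A minimal sketch (using the parenthesized `print(...)` form so it runs on both Python 2 and 3):

```python
from multiprocessing import Pool, cpu_count

def add(num):
    return num + 1

if __name__ == '__main__':
    # One Pool, created once in the parent process and driven from the parent
    pool = Pool(processes=cpu_count())
    result = pool.apply_async(add, [5])
    print(result.get(timeout=3))
    pool.close()
    pool.join()
```

This also avoids paying the cost of spawning a fresh pool of workers on every call to `add_wrap`.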

