Multiprocessing Queue maxsize limit is 32767


Question

I'm trying to write a Python 2.6 (OSX) program using multiprocessing, and I want to populate a Queue with more than the default of 32767 items.

from multiprocessing import Queue
Queue(2**15) # raises OSError

Queue(32767) works fine, but any higher number (e.g. Queue(32768)) fails with OSError: [Errno 22] Invalid argument

Is there a workaround for this issue?

Answer

One approach would be to wrap your multiprocessing.Queue with a custom class (just on the producer side, or transparently from the consumer's perspective). Using that, you would queue up items to be dispatched to the Queue object that you're wrapping, and only feed items from the local queue (a Python list() object) into the multiprocessing.Queue as space becomes available, with exception handling to throttle when the Queue is full.

That's probably the easiest approach since it should have the minimum impact on the rest of your code. The custom class should behave just like a Queue while hiding the underlying multiprocessing.Queue behind your abstraction.
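A minimal sketch of that wrapper, assuming a hypothetical class name BufferedQueue (it is not part of multiprocessing): put() appends to a local list and then moves as many buffered items as possible into the real multiprocessing.Queue with put_nowait(), backing off when the queue reports Full.

from multiprocessing import Queue
try:
    from queue import Full        # Python 3
except ImportError:
    from Queue import Full        # Python 2

class BufferedQueue(object):
    # Hypothetical producer-side wrapper: overflow items live in a local
    # list and are fed into the bounded multiprocessing.Queue as space
    # becomes available.
    def __init__(self, maxsize=32767):
        self._queue = Queue(maxsize)   # stay at or below the OS limit
        self._overflow = []            # unbounded local buffer

    def put(self, item):
        self._overflow.append(item)
        self._drain()

    def _drain(self):
        # Move buffered items until the real queue refuses one.
        while self._overflow:
            try:
                self._queue.put_nowait(self._overflow[0])
            except Full:
                break                  # full right now; retry on the next put()
            self._overflow.pop(0)

    def get(self, *args, **kwargs):
        # Consumers read the underlying queue as usual.
        return self._queue.get(*args, **kwargs)

One caveat of this sketch: items only leave the overflow buffer on later put() calls, so a real implementation would also want a flush() method (or a final drain loop) for when the producer finishes.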

(One approach might be to have your producer use threads, with one thread managing the dispatch from a threading Queue to your multiprocessing.Queue and any other threads just feeding the threading Queue.)
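A sketch of that threading variant (the dispatcher function and queue names are illustrative): the producer feeds an unbounded queue.Queue, and a single daemon thread blocks on the bounded multiprocessing.Queue, so only the dispatcher thread ever waits on the 32767 limit.

import threading
from multiprocessing import Queue as MPQueue
try:
    import queue                    # Python 3
except ImportError:
    import Queue as queue           # Python 2

mp_queue = MPQueue(32767)           # bounded queue shared with worker processes
local_queue = queue.Queue()         # unbounded, producer-side only

def dispatcher():
    # Blocks on mp_queue.put() whenever the shared queue is full, so
    # only this thread (not the producer) waits on the 32767 limit.
    while True:
        mp_queue.put(local_queue.get())

t = threading.Thread(target=dispatcher)
t.daemon = True                     # don't keep the process alive on exit
t.start()

# The producer just feeds the local queue and never blocks:
for item in range(100000):
    local_queue.put(item)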
