Maximum size for multiprocessing.Queue item?


Question


I'm working on a fairly large project in Python that requires one of the compute-intensive background tasks to be offloaded to another core, so that the main service isn't slowed down. I've come across some apparently strange behaviour when using multiprocessing.Queue to communicate results from the worker process. Using the same queue for both a threading.Thread and a multiprocessing.Process for comparison purposes, the thread works just fine but the process fails to join after putting a large item in the queue. Observe:

import threading
import multiprocessing

class WorkerThread(threading.Thread):
    def __init__(self, queue, size):
        threading.Thread.__init__(self)
        self.queue = queue
        self.size = size

    def run(self):
        self.queue.put(range(self.size))


class WorkerProcess(multiprocessing.Process):
    def __init__(self, queue, size):
        multiprocessing.Process.__init__(self)
        self.queue = queue
        self.size = size

    def run(self):
        self.queue.put(range(self.size))


if __name__ == "__main__":
    size = 100000
    queue = multiprocessing.Queue()

    worker_t = WorkerThread(queue, size)
    worker_p = WorkerProcess(queue, size)

    worker_t.start()
    worker_t.join()
    print 'thread results length:', len(queue.get())

    worker_p.start()
    worker_p.join()
    print 'process results length:', len(queue.get())


I've seen that this works fine for size = 10000, but hangs at worker_p.join() for size = 100000. Is there some inherent size limit to what multiprocessing.Process instances can put in a multiprocessing.Queue? Or am I making some obvious, fundamental mistake here?


For reference, I am using Python 2.6.5 on Ubuntu 10.04.

Answer


It seems the underlying pipe is full, so the feeder thread blocks on the write to the pipe (actually, when trying to acquire the lock protecting the pipe from concurrent access).

See this bug report: http://bugs.python.org/issue8237
