python multiprocessing - process hangs on join for large queue


Problem Description


I'm running python 2.7.3 and I noticed the following strange behavior. Consider this minimal example:

from multiprocessing import Process, Queue

def foo(qin, qout):
    # Worker: pull items from qin until the None sentinel arrives,
    # wrapping each one in a dict and pushing it onto qout.
    while True:
        bar = qin.get()
        if bar is None:
            break
        qout.put({'bar': bar})

if __name__ == '__main__':
    import sys

    qin = Queue()
    qout = Queue()
    worker = Process(target=foo, args=(qin, qout))
    worker.start()

    for i in range(100000):
        print i
        sys.stdout.flush()
        qin.put(i**2)

    qin.put(None)  # sentinel: tell the worker to stop
    worker.join()


When the loop runs to 10,000 or more, my script hangs on worker.join(). It works fine when the loop only goes to 1,000.

Any ideas?

Recommended Answer


The qout queue in the subprocess gets full. The data that foo() puts into it doesn't fit in the buffer of the OS pipe used internally, so the subprocess blocks trying to push more data through. But the parent process is not reading that data: it is blocked too, waiting on worker.join() for the subprocess to finish. This is a classic deadlock.
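
One way to break the deadlock, consistent with this explanation (and with the "Joining processes that use queues" warning in the multiprocessing programming guidelines), is to have the parent drain qout before calling worker.join(). Below is a minimal sketch of that idea, assuming the same 100,000-item workload as the question; the N variable is introduced here purely for illustration:

from multiprocessing import Process, Queue

def foo(qin, qout):
    while True:
        bar = qin.get()
        if bar is None:
            break
        qout.put({'bar': bar})

if __name__ == '__main__':
    N = 100000  # same workload as in the question

    qin = Queue()
    qout = Queue()
    worker = Process(target=foo, args=(qin, qout))
    worker.start()

    for i in range(N):
        qin.put(i**2)
    qin.put(None)  # sentinel: tell the worker to stop

    # Read every result *before* joining. Draining qout empties the
    # underlying pipe, so the worker's feeder thread can flush its
    # buffer, the worker can exit, and join() can return.
    results = [qout.get() for _ in range(N)]

    worker.join()
    print len(results)  # 100000

Interleaving the gets with the puts would work just as well; the essential point is that the parent must consume qout rather than block on join() while the queue is still full.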
