python multiprocessing - process hangs on join for large queue

Problem Description

I'm running python 2.7.3 and I noticed the following strange behavior. Consider this minimal example:

from multiprocessing import Process, Queue

def foo(qin, qout):
    while True:
        bar = qin.get()
        if bar is None:  # sentinel: tells the worker to stop
            break
        qout.put({'bar': bar})

if __name__ == '__main__':
    import sys

    qin = Queue()
    qout = Queue()
    worker = Process(target=foo, args=(qin, qout))
    worker.start()

    for i in range(100000):
        print i
        sys.stdout.flush()
        qin.put(i**2)

    qin.put(None)
    worker.join()  # hangs here when the loop count is large

When I loop over 10,000 or more, my script hangs on worker.join(). It works fine when the loop only goes to 1,000.

Any ideas?

Answer

The qout queue in the subprocess gets full. The data you put into it from foo() doesn't fit in the buffer of the OS pipe that the queue uses internally, so the subprocess blocks while trying to push more data through. But the parent process is not reading that data: it is blocked too, waiting on worker.join() for the subprocess to finish. This is a classic deadlock.
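One way to break the deadlock is to drain qout before calling worker.join(). A minimal sketch follows, under the assumption that the parent knows exactly how many results to expect (here N, matching the number of items put on qin). This works because multiprocessing.Queue.put() hands items to a background feeder thread rather than blocking the caller, so the parent can finish all its qin.put() calls first and then consume the results, keeping the pipe from filling up:

from multiprocessing import Process, Queue

def foo(qin, qout):
    while True:
        bar = qin.get()
        if bar is None:  # sentinel: tells the worker to stop
            break
        qout.put({'bar': bar})

if __name__ == '__main__':
    N = 100000  # assumed known item count, for illustration
    qin = Queue()
    qout = Queue()
    worker = Process(target=foo, args=(qin, qout))
    worker.start()

    for i in range(N):
        qin.put(i**2)
    qin.put(None)

    # Drain qout *before* joining. Exactly N results are coming, so
    # reading them here keeps the pipe from filling, lets the worker
    # flush its queue, and allows it to exit.
    results = [qout.get() for _ in range(N)]

    worker.join()

Equivalently, the results could be consumed on the fly (interleaving qout.get() with the qin.put() loop) or by a second process; the essential point is that something reads qout so the worker can flush its buffered output and terminate.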
