Create child processes inside a child process with Python multiprocessing failed


Problem Description

I observed this behavior when trying to create nested child processes in Python. Here is the parent program parent_process.py:

import multiprocessing
import child_process

# Submit four tasks; each one will try to create a nested Pool of its own.
pool = multiprocessing.Pool(processes=4)
for i in range(4):
    pool.apply_async(child_process.run, ())
pool.close()
pool.join()

The parent program calls the "run" function in the following child program child_process.py:

import multiprocessing

def run():
    # Each worker tries to create a nested Pool of its own.
    pool = multiprocessing.Pool(processes=4)
    print 'TEST!'
    pool.close()
    pool.join()

When I run the parent program, nothing is printed and the program exits quickly. However, if print 'TEST!' is moved one line up (before the nested Pool is created), 'TEST!' is printed 4 times.

Because errors in a child process are not printed to the screen, this seems to show that the program crashes when a child process tries to create its own nested child processes.

Could anyone explain what happens behind the scenes? Thanks!
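One way to surface the hidden error (not shown in the original question; a minimal debugging sketch) is to keep the AsyncResult objects returned by apply_async and call get() on them, which re-raises the worker's exception in the parent:

import multiprocessing
import child_process

pool = multiprocessing.Pool(processes=4)
# Keep the AsyncResult handles so the workers' exceptions can be retrieved.
results = [pool.apply_async(child_process.run, ()) for _ in range(4)]
pool.close()
pool.join()
for r in results:
    r.get()  # re-raises any exception that occurred in the worker

With the programs above, get() should re-raise the underlying error instead of letting it disappear silently.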

Answer

According to the multiprocessing documentation, daemonic processes cannot spawn child processes.
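A small self-check (added here as an illustration, not part of the original answer) confirms that the workers a Pool creates are daemonic:

import multiprocessing

def is_daemon():
    # Inside a Pool worker, the current process has daemon set to True,
    # which is why it is not allowed to start children of its own.
    return multiprocessing.current_process().daemon

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=2)
    print(pool.apply(is_daemon))  # expected to print True
    pool.close()
    pool.join()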

multiprocessing.Pool uses daemonic processes to ensure that they don't leak when your program exits.
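If the goal is simply to let each child create its own nested pool, one common workaround (a sketch, assuming the outer level does not need Pool's result handling) is to use plain multiprocessing.Process objects at the outer level; they are non-daemonic by default, so the nested Pool inside child_process.run can be created without error:

import multiprocessing
import child_process

if __name__ == '__main__':
    # Process objects default to daemon=False, so each child is allowed
    # to create the nested Pool inside child_process.run.
    procs = [multiprocessing.Process(target=child_process.run) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()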
