multiprocessing produces defunct process


Problem description

I use Tornado as a web server. A user can submit a task through the front-end page, and after auditing they can start the submitted task. In this situation, I want to start an asynchronous subprocess to handle the task, so I wrote the following code in a request handler:

import multiprocessing

def task_handler():
    pass  # handle the task here

def start_a_process_for_task():
    p = multiprocessing.Process(target=task_handler, args=())
    p.start()
    return 0

I don't care about the subprocess itself; I just start a process for the task and return to the front-end page to tell the user the task has started. The task runs in the background and records its status or results to the database, so the user can view them on the web page later. So I don't want to use p.join(), which blocks, but without p.join() the subprocess becomes a defunct process after the task finishes, and since Tornado runs as a daemon and never exits, the defunct process never disappears.

Does anyone know how to fix this problem? Thanks.

Recommended answer

You need to join your subprocesses if you do not want to create zombies. You can do it in threads.

This is a dummy example. After 10 seconds, all the subprocesses are gone instead of lingering as zombies. It launches a thread for every subprocess. The threads themselves do not need to be joined or waited on: each thread starts a subprocess, joins it, and exits as soon as the subprocess completes.

import multiprocessing
import threading
from time import sleep

def task_processor():
    sleep(10)

class TaskProxy(threading.Thread):
    def __init__(self):
        super(TaskProxy, self).__init__()

    def run(self):
        p = multiprocessing.Process(target=task_processor, args=())
        p.start()
        p.join()  # blocks this thread only, and reaps the child when it exits

def task_handler():
    t = TaskProxy()
    t.daemon = True
    t.start()
    return

for _ in range(0, 20):
    task_handler()

sleep(60)

