How to kill zombie processes created by the multiprocessing module?


Question


I'm very new to the multiprocessing module. I just tried to create the following: I have one process whose job is to get messages from RabbitMQ and pass them to an internal queue (multiprocessing.Queue). Then what I want to do is spawn a process whenever a new message comes in. It works, but after the job is finished it leaves behind a zombie process that is not terminated by its parent. Here is my code:

Main process:

 #!/usr/bin/env python

 import multiprocessing
 import logging
 import consumer
 import producer
 import worker
 import time
 import base

 conf = base.get_settings()
 logger = base.logger(identity='launcher')

 request_order_q = multiprocessing.Queue()
 result_order_q = multiprocessing.Queue()

 request_status_q = multiprocessing.Queue()
 result_status_q = multiprocessing.Queue()

 CONSUMER_KEYS = [{'queue':'product.order',
                   'routing_key':'product.order',
                   'internal_q':request_order_q}]
 #                 {'queue':'product.status',
 #                  'routing_key':'product.status',
 #                  'internal_q':request_status_q}]

 def main():
     # Launch consumers
     for key in CONSUMER_KEYS:
         cons = consumer.RabbitConsumer(rabbit_q=key['queue'],
                                        routing_key=key['routing_key'],
                                        internal_q=key['internal_q'])
         cons.start()

     # Check request_order_q; if not empty, spawn a process to handle the message
     while True:
         time.sleep(0.5)
         if not request_order_q.empty():
             handler = worker.Worker(request_order_q.get())
             logger.info('Launching Worker')
             handler.start()

 if __name__ == "__main__":
     main()

Here is my worker:

 import multiprocessing
 import sys 
 import time
 import base

 conf = base.get_settings()
 logger = base.logger(identity='worker')

 class Worker(multiprocessing.Process):

     def __init__(self, msg):
         super(Worker, self).__init__()
         self.msg = msg 
         self.daemon = True

     def run(self):
         logger.info('%s' % self.msg)
         time.sleep(10)
         sys.exit(1)


So after all the messages get processed, I can still see the processes with the ps aux command. But I would really like them to be terminated once they finish. Thanks.

Answer

A couple of points:


  1. Make sure the parent joins its children to avoid zombies. See Python Multiprocessing Kill Processes.


  2. You can check whether a child is still running with the is_alive() member function. See http://docs.python.org/2/library/multiprocessing.html#multiprocessing.Process
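Applied to the launcher above, both points amount to reaping finished children on each pass of the dispatch loop. A minimal, self-contained sketch — the `reap()` helper and the simplified `Worker` here are illustrative, not part of the original code:

```python
import multiprocessing
import time


class Worker(multiprocessing.Process):
    """Simplified stand-in for the worker above."""

    def __init__(self, msg):
        super(Worker, self).__init__()
        self.msg = msg

    def run(self):
        time.sleep(0.1)  # pretend to process the message


def reap(workers):
    """Join children that have exited; return the ones still alive.

    join() on a finished child collects its exit status, so it no
    longer shows up as <defunct> in ps aux.
    """
    alive = []
    for w in workers:
        if w.is_alive():
            alive.append(w)
        else:
            w.join()
    return alive


if __name__ == "__main__":
    workers = [Worker(i) for i in range(3)]
    for w in workers:
        w.start()
    # Dispatch loop: reap finished children on every iteration,
    # just as the launcher's while-True loop could between get() calls.
    while workers:
        time.sleep(0.2)
        workers = reap(workers)
```

Calling multiprocessing.active_children() on each pass has the same effect, since it joins any children that have finished as a side effect.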
