Forking processes for every task in Celery
Problem description
I currently use a C extension library for Python, but it seems to have memory leaks. Tasks run on my celeryd use this C extension library, and after about an hour celeryd eats a lot of memory. I cannot patch this C extension library for several reasons, so instead I want to fork a fresh process for every task in Celery. Does Celery have an option for this?
Recommended answer
You can use the CELERYD_MAX_TASKS_PER_CHILD option or the --maxtasksperchild celeryd switch.
To restart worker processes after every task:
CELERYD_MAX_TASKS_PER_CHILD=1
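A minimal sketch of how this setting might look in a Celery config module (the module name `celeryconfig.py` is the conventional one, assuming the worker is started with that config):

```python
# celeryconfig.py -- loaded by the celeryd worker on startup.
# Recycle each worker process after it has executed a single task,
# so any memory leaked by the C extension is reclaimed when the
# child process exits and a fresh one is forked in its place.
CELERYD_MAX_TASKS_PER_CHILD = 1
```

The command-line equivalent would be starting the worker with `--maxtasksperchild=1`. Note that recycling a process per task adds fork overhead, so for short tasks you may prefer a higher value that still bounds the leak (e.g. recycle every 100 tasks).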
https://celery.readthedocs.org/en/latest/userguide/workers.html#max-tasks-per-child-setting