How can I automatically reload tasks modules with Celery daemon?
I am using Fabric to deploy a Celery broker (running RabbitMQ) and multiple Celery workers with celeryd
daemonized through supervisor
. I cannot for the life of me figure out how to reload the tasks.py
module short of rebooting the servers.
/etc/supervisor/conf.d/celeryd.conf
[program:celeryd]
directory=/fab-mrv/celeryd
environment=[RABBITMQ credentials here]
command=xvfb-run celeryd --loglevel=INFO --autoreload
autostart=true
autorestart=true
celeryconfig.py
import os
## Broker settings
BROKER_URL = "amqp://%s:%s@hostname" % (os.environ["RMQU"], os.environ["RMQP"])
# List of modules to import when celery starts.
CELERY_IMPORTS = ("tasks", )
## Using the database to store task state and results.
CELERY_RESULT_BACKEND = "amqp"
CELERYD_POOL_RESTARTS = True
Additional information
- celery --version: 3.0.19 (Chiastic Slide)
- python --version: 2.7.3
- lsb_release -a: Ubuntu 12.04.2 LTS
- rabbitmqctl status: ... 2.7.1 ...
Here are some things I have tried:
- The celeryd --autoreload flag
- sudo supervisorctl restart celeryd
- celery.control.broadcast('pool_restart', arguments={'reload': True})
- ps auxww | grep celeryd | grep -v grep | awk '{print $2}' | xargs kill -HUP
And unfortunately, nothing causes the workers to reload the tasks.py module (e.g. after running git pull
to update the file). The gist of the relevant fab functions is available here.
The brokers/workers run fine after a reboot.
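Part of the explanation is simply how Python imports work: modules are cached in sys.modules, so nothing short of an explicit reload (or a fresh process) picks up an edited tasks.py, which is why signalling the workers has no visible effect. A minimal stdlib illustration (the module name demo_tasks is invented for the demo):

```python
import importlib
import os
import sys
import tempfile

# Create a throwaway module on disk.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "demo_tasks.py")
with open(path, "w") as f:
    f.write("VERSION = 1\n")

sys.path.insert(0, tmpdir)
importlib.invalidate_caches()
import demo_tasks
assert demo_tasks.VERSION == 1

# Edit the module on disk, bumping the mtime so the bytecode cache
# cannot mask the change.
with open(path, "w") as f:
    f.write("VERSION = 2\n")
os.utime(path, (os.path.getmtime(path) + 10,) * 2)

import demo_tasks              # no-op: the cached module is returned
assert demo_tasks.VERSION == 1

importlib.reload(demo_tasks)   # an explicit reload re-executes the source
assert demo_tasks.VERSION == 2
```

This is exactly what the worker has to do internally, which is why a generic kill -HUP is not enough on its own.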
Just a shot in the dark: with the celeryd --autoreload option, did you make sure you have one of the file system notification backends? The docs recommend pyinotify for Linux, so I'd start by making sure that package is installed.
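A quick way to run that check on the worker host (written for a Python 3 shell; on the 2.7 box in the question, pkgutil.find_loader("pyinotify") does the same job). Without pyinotify, the autoreloader falls back to slow stat-based polling:

```python
import importlib.util

# True when pyinotify is importable; Celery's --autoreload uses it on
# Linux for inotify-based change detection.
has_pyinotify = importlib.util.find_spec("pyinotify") is not None
print("pyinotify installed:", has_pyinotify)
```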