Using celeryd as a daemon with multiple django apps?


Problem description

I'm just starting to use django-celery and I'd like to set up celeryd running as a daemon. The instructions, however, appear to suggest that it can be configured for only one site/project at a time. Can celeryd handle more than one project, or only one? And, if that is the case, is there a clean way to have celeryd started automatically for each configuration, without requiring me to create a separate init script for each one?

Recommended answer

Like all interesting questions, the answer is: it depends. :)

It is definitely possible to come up with a scenario in which celeryd can be used by two independent sites. If multiple sites submit tasks to the same exchange, and the tasks do not require access to any site-specific database -- say, they operate on email addresses, credit card numbers, or something other than a database record -- then one celeryd may be sufficient. Just make sure that the task code lives in a shared module that is loaded by all of the sites and by the celery server.
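As a rough illustration, here is a minimal sketch of such a shared task module, assuming a django-celery 2.x-style setup; the module path, task name, and addresses are hypothetical and not from the original answer:

```python
# shared_tasks/tasks.py -- hypothetical shared module, importable by both
# sites and by the celery worker (it must be on everyone's PYTHONPATH).
from celery.task import task
from django.core.mail import send_mail


@task
def send_welcome_email(email_address):
    """Operates only on the address passed in -- no site-specific
    database records are touched, so a single shared worker can run it."""
    send_mail(
        subject="Welcome!",
        message="Thanks for signing up.",
        from_email="noreply@example.com",   # placeholder sender address
        recipient_list=[email_address],
    )
```

Either site could then call send_welcome_email.delay(user.email), and one worker consuming the shared exchange would execute it.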

Usually, though, you'll find that celery needs access to the database -- either it loads objects based on an ID passed as a task parameter, or it has to write changes to the database, or, most often, both. And multiple sites/projects usually don't share a database, even if they share the same apps, so you'll need to keep the task queues separate.

In that case, what will usually happen is that you set up a single message broker (RabbitMQ, for example) with multiple exchanges. Each exchange receives messages from a single site. You then run one or more celeryd processes somewhere for each exchange (in the celery config settings you specify the exchange; I don't believe celeryd can listen to multiple exchanges). Each celeryd server knows its exchange, the apps it should load, and the database it should connect to.
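As a sketch of what "each celeryd knows its exchange and database" could look like with django-celery 2.x-style settings, the snippet below uses illustrative host, vhost, queue, and credential names that are assumptions, not part of the original answer:

```python
# settings.py for site_a (site_b would mirror this with its own names).
# A separate RabbitMQ vhost per site keeps the two sites' messages isolated.
BROKER_HOST = "rabbitmq.internal"      # assumed broker hostname
BROKER_PORT = 5672
BROKER_USER = "site_a"
BROKER_PASSWORD = "secret"             # placeholder credential
BROKER_VHOST = "site_a"                # one vhost per site

# A worker started with this settings module consumes only site_a's queue.
CELERY_DEFAULT_QUEUE = "site_a"
CELERY_DEFAULT_EXCHANGE = "site_a"
CELERY_DEFAULT_EXCHANGE_TYPE = "direct"
CELERY_DEFAULT_ROUTING_KEY = "site_a"

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "site_a",              # each worker talks to its own database
    }
}
```

Each site's worker would then be started with its own DJANGO_SETTINGS_MODULE (for example, via separate init-script configurations), which is what actually keeps the task queues apart.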

To manage these, I would suggest looking into cyme -- it's by @asksol, and it manages multiple celeryd instances, on multiple servers if necessary. I haven't tried it, but it looks like it should handle different configurations for different instances.
