Running Celery tasks periodically (without Django)
Problem description
I am trying to run a few functions (tasks) periodically with Celery, say every 3 seconds.
The closest I have got is running the tasks just once.
This is my Celery configuration file:
# celeryconfig.py
from datetime import timedelta

BROKER_URL = 'amqp://guest@localhost//'
CELERY_RESULT_BACKEND = 'rpc://'

CELERYBEAT_SCHEDULE = {
    'f1-every-3-seconds': {
        'task': 'tasks.f1',
        'schedule': timedelta(seconds=3),
        'args': (1, 2)
    },
    'f2-every-3-seconds': {
        'task': 'tasks.f2',
        'schedule': timedelta(seconds=3),
        'args': (3, 4)
    },
}
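Since CELERYBEAT_SCHEDULE is an ordinary Python dict, a typo such as a missing 'schedule' key makes an entry silently never fire. As a side note, the dict can be sanity-checked in plain Python before starting beat; a small sketch (the check_schedule helper is hypothetical, not part of Celery):

```python
from datetime import timedelta

# Mirrors the schedule from celeryconfig.py above.
CELERYBEAT_SCHEDULE = {
    'f1-every-3-seconds': {
        'task': 'tasks.f1',
        'schedule': timedelta(seconds=3),
        'args': (1, 2),
    },
    'f2-every-3-seconds': {
        'task': 'tasks.f2',
        'schedule': timedelta(seconds=3),
        'args': (3, 4),
    },
}

def check_schedule(schedule):
    """Return the names of entries missing a required key."""
    required = {'task', 'schedule'}
    return [name for name, entry in schedule.items()
            if not required <= set(entry)]

print(check_schedule(CELERYBEAT_SCHEDULE))  # [] means every entry is complete
```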
This is where I declare the tasks:
# tasks.py
import celeryconfig
from celery import Celery

dbwapp = Celery('tasks')
dbwapp.config_from_object(celeryconfig)

@dbwapp.task()
def f1(a, b):
    print("F1: {0}, {1}".format(a, b))

@dbwapp.task()
def f2(a, b):
    print("F2: {0}, {1}".format(a, b))
And this is where my main program runs:
# tasks_runner.py
from tasks import f1, f2, dbwapp

f1.delay(5, 6)
f2.delay(7, 8)
I run my code with: python tasks_runner.py
but don't manage to make those two functions run periodically. This is the output I get:
[2016-03-31 23:36:16,108: WARNING/Worker-9] F1: 5, 6
[2016-03-31 23:36:16,109: WARNING/Worker-6] F2: 7, 8
What am I doing wrong? How do I make f1 and f2 run periodically?
Answer
Using your code, I was able to start celery, including the scheduled tasks, this way:
$ celery beat
celery beat v3.1.23 (Cipater) is starting.
Configuration ->
. broker -> redis://localhost:6379/0
. loader -> celery.loaders.default.Loader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%INFO
. maxinterval -> now (0s)
[2016-04-01 00:15:05,377: INFO/MainProcess] beat: Starting...
[2016-04-01 00:15:08,402: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:08,410: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:11,403: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:11,411: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:14,404: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:14,412: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:17,404: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:17,412: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:20,405: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:20,413: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:23,406: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:23,413: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
[2016-04-01 00:15:26,407: INFO/MainProcess] Scheduler: Sending due task f2-every-3-seconds (tasks.f2)
[2016-04-01 00:15:26,414: INFO/MainProcess] Scheduler: Sending due task f1-every-3-seconds (tasks.f1)
It apparently loads the default configuration for celery, and the started beat service begins firing scheduled tasks according to that configuration.
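Conceptually, the beat service is little more than a loop that re-sends each due task when its interval elapses; a toy stdlib sketch of that idea (an illustration, not Celery internals):

```python
import time

def run_periodically(send, interval, repeats, *args):
    # Toy stand-in for celery beat: it does not execute the task itself,
    # it merely "sends" the request on schedule.
    for _ in range(repeats):
        send(*args)
        time.sleep(interval)

sent = []
run_periodically(lambda a, b: sent.append((a, b)), 0.01, 3, 1, 2)
print(sent)  # the same "task" triggered three times
```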
However, this only sends requests to perform the tasks; the actual worker is still missing. The worker can be started in another console:
$ celery worker -A tasks
[2016-04-01 00:31:46,950: WARNING/MainProcess] celery@zen ready.
[2016-04-01 00:31:47,029: WARNING/Worker-4] F2: 3, 4
[2016-04-01 00:31:47,029: WARNING/Worker-2] F1: 1, 2
[2016-04-01 00:31:47,036: WARNING/Worker-3] F2: 3, 4
[2016-04-01 00:31:47,036: WARNING/Worker-1] F1: 1, 2
[2016-04-01 00:31:48,829: WARNING/Worker-4] F2: 3, 4
[2016-04-01 00:31:48,829: WARNING/Worker-2] F1: 1, 2
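The split between beat and worker is a plain producer/consumer pattern over the broker; a minimal stdlib sketch of that relationship, using queue.Queue as a stand-in for the broker (illustration only, not how Celery is implemented):

```python
import queue
import threading

broker = queue.Queue()
results = []

def beat():
    # "beat": sends task requests (here, three at once instead of on a timer)
    for args in [(1, 2), (3, 4), (1, 2)]:
        broker.put(('tasks.f1', args))

def worker():
    # "worker": consumes requests from the broker and actually executes them
    while True:
        name, args = broker.get()
        results.append('F1: {0}, {1}'.format(*args))
        broker.task_done()

threading.Thread(target=worker, daemon=True).start()
beat()
broker.join()  # wait until every sent request has been processed
print(results)
```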
If you want to use only one worker, you can start it together with the beat service at once:
$ celery worker -A tasks -B