Python+Celery: Chaining jobs?
Question
The Celery documentation suggests that it's a bad idea to have tasks wait on the results of other tasks… But the suggested solution (see the "good" heading) leaves something to be desired. Specifically, there's no clear way of getting the subtask's result back to the caller (also, it's kind of ugly).
So, is there any way of "chaining" jobs, so the caller gets the result of the final job? For example, using the add example:
>>> add3 = add.subtask(args=(3, ))
>>> add.delay(1, 2, callback=add3).get()
6
Alternately, is it OK to return instances of Result? For example:
@task
def add(x, y, callback=None):
    result = x + y
    if callback:
        return subtask(callback).delay(result)
    return result
This would let the result of the "final" job in the chain be retrieved with a simple:
result = add.delay(1, 2, callback=add3)
while isinstance(result, Result):
    result = result.get()
print "result:", result
Answer
You can do it with a Celery chain. See https://celery.readthedocs.org/en/latest/userguide/canvas.html#chains
import time

@task()
def add(a, b):
    time.sleep(5)  # simulate a long-running task
    return a + b
Chaining the jobs:
# import chain
from celery import chain

# the result of the first add job will be
# the first argument of the second add job
ret = chain(add.s(1, 2), add.s(3)).apply_async()

# another way to express a chain using pipes
ret2 = (add.s(1, 2) | add.s(3)).apply_async()

...

# check ret status to get result
if ret.status == u'SUCCESS':
    print "result:", ret.get()
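To see why `chain(add.s(1, 2), add.s(3))` produces 6, here is a minimal, broker-free sketch of the composition semantics: each signature pre-binds some arguments, and the previous task's return value becomes the first argument of the next task. Note that `Sig` and `run_chain` are illustrative stand-ins written for this answer, not Celery APIs.

```python
def add(a, b):
    return a + b

class Sig:
    """Stand-in for a Celery signature: a function plus pre-bound arguments."""
    def __init__(self, func, *args):
        self.func = func
        self.args = args

def run_chain(*sigs):
    """Run signatures in order, feeding each return value into the next task."""
    result = sigs[0].func(*sigs[0].args)
    for sig in sigs[1:]:
        result = sig.func(result, *sig.args)
    return result

# add(1, 2) -> 3, then add(3, 3) -> 6, mirroring chain(add.s(1, 2), add.s(3))
print(run_chain(Sig(add, 1, 2), Sig(add, 3)))  # prints 6
```

The real chain does the same thing asynchronously: each worker passes its return value on as the first argument of the next signature, and `ret.get()` on the chain's result gives you the final task's value, which is exactly what the question asked for.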