Running "unique" tasks with celery

Problem description

I use celery to update RSS feeds in my news aggregation site. I use one @task for each feed, and things seem to work nicely.

There's one detail I'm not sure I handle well, though: all feeds are updated once every minute with a @periodic_task, but what if a feed is still updating from the last periodic task when a new one starts? (For example, if the feed is really slow, or offline and the task is stuck in a retry loop.)

Currently I store task results and check their status like this:

import socket
from datetime import timedelta
from celery.decorators import task, periodic_task
from aggregator.models import Feed


# Maps each feed's primary key to the AsyncResult of its last update task.
_results = {}


@periodic_task(run_every=timedelta(minutes=1))
def fetch_articles():
    for feed in Feed.objects.all():
        if feed.pk in _results:
            if not _results[feed.pk].ready():
                # The previous update for this feed is still running; skip it.
                continue
        _results[feed.pk] = update_feed.delay(feed)


@task()
def update_feed(feed):
    try:
        feed.fetch_articles()
    except socket.error as exc:
        # Network error: retry the task later instead of failing outright.
        update_feed.retry(args=[feed], exc=exc)

Maybe there is a more sophisticated/robust way of achieving the same result using some celery mechanism that I missed?

Recommended answer

From the official documentation: Ensuring a task is only executed one at a time.
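
That recipe works by acquiring a cache-backed lock before running the task body, so only one worker at a time can update a given feed, no matter which process the periodic task runs in (unlike the module-level _results dict, which is local to a single worker process). Below is a minimal sketch of that pattern adapted to the update_feed task from the question; it assumes a Django cache backend (e.g. memcached) is configured, and the lock key format and LOCK_EXPIRE value are illustrative choices, not part of the original code or the documentation recipe verbatim.

import socket
from celery.decorators import task
from django.core.cache import cache

# Upper bound on how long a lock is held; should comfortably exceed the
# longest expected feed update (illustrative value).
LOCK_EXPIRE = 60 * 5


@task()
def update_feed(feed):
    # One lock per feed. cache.add() only succeeds if the key does not
    # exist yet (atomic on backends such as memcached), so only one
    # worker can hold the lock for a given feed at a time.
    lock_id = "update-feed-lock-%s" % feed.pk
    if not cache.add(lock_id, "true", LOCK_EXPIRE):
        # Another worker is still updating this feed; skip this run.
        return
    try:
        feed.fetch_articles()
    except socket.error as exc:
        update_feed.retry(args=[feed], exc=exc)
    finally:
        # Release the lock so the next periodic run can update the feed.
        cache.delete(lock_id)

With a shared lock like this, fetch_articles() can simply call update_feed.delay(feed) for every feed and drop the _results bookkeeping: a duplicate delivery just returns immediately when it fails to acquire the lock.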
