Celery (Django) Rate Limiting

Problem Description

I'm using Celery to process multiple data-mining tasks. One of these tasks connects to a remote service which allows a maximum of 10 simultaneous connections per user (or in other words, it CAN exceed 10 connections globally but it CANNOT exceed 10 connections per individual job).

I THINK Token Bucket (rate limiting) is what I'm looking for, but I can't seem to find any implementation of it.
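
A note on the fit: a token bucket throttles the rate of requests rather than the number of open connections, so for a hard cap of 10 simultaneous connections a counting semaphore may be the closer match. Since no implementation was found, below is a minimal, in-process token-bucket sketch; the class name, capacity, and refill rate are illustrative and not from the question, and a limiter shared across worker processes would need to keep its state in something like Redis rather than local memory.

```python
import threading
import time


class TokenBucket:
    """Simple thread-safe token bucket: acquire() blocks until a token is free."""

    def __init__(self, capacity, refill_rate):
        self.capacity = capacity        # maximum tokens the bucket can hold
        self.refill_rate = refill_rate  # tokens added back per second
        self.tokens = float(capacity)
        self.last_refill = time.monotonic()
        self.lock = threading.Lock()

    def acquire(self):
        while True:
            with self.lock:
                now = time.monotonic()
                # Top the bucket up in proportion to elapsed time, capped at capacity.
                self.tokens = min(
                    self.capacity,
                    self.tokens + (now - self.last_refill) * self.refill_rate,
                )
                self.last_refill = now
                if self.tokens >= 1:
                    self.tokens -= 1
                    return
            time.sleep(0.05)  # brief back-off before re-checking


# Example: allow a burst of up to 10 requests, refilling 2 tokens per second.
bucket = TokenBucket(capacity=10, refill_rate=2)
bucket.acquire()  # call once before each outbound request
```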

Recommended Answer

After much research I found that Celery does not explicitly provide a way to limit the number of concurrent instances of a task like this; furthermore, doing so is generally considered bad practice.
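
For context (this illustration is not part of the original answer): the closest built-in option is the task-level rate_limit, which throttles how often a task may start per worker; it does not cap how many instances run simultaneously across the cluster, which is why it does not solve the per-job connection limit. A hedged sketch, with the app name and broker URL assumed:

```python
# Celery's rate_limit throttles task *starts* per worker (here at most 10 per
# minute); it is not a cap on simultaneously running instances, so it cannot
# enforce "no more than 10 concurrent connections per individual job".
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")  # broker URL is an assumption


@app.task(rate_limit="10/m")
def mine_remote_service(resource_id):
    ...  # connect to the remote service and fetch data
```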

The better solution is to perform the downloads concurrently within a single task, and use Redis or Memcached to store the results and hand them off to other tasks for processing.
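
A minimal sketch of that suggestion, assuming the requests library, a local Redis instance, and hypothetical names such as fetch_url, download_batch, and the downloaded_pages list key: one task performs all the downloads itself with a thread pool capped at 10 workers, so the per-job connection limit is respected, and pushes the results into Redis for separate tasks to consume.

```python
import json
from concurrent.futures import ThreadPoolExecutor

import redis
import requests
from celery import shared_task

r = redis.Redis()  # assumed local Redis instance


def fetch_url(url):
    """Download a single resource; one outbound connection per call."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.text


@shared_task
def download_batch(urls):
    # The pool size enforces the per-job cap of 10 simultaneous connections.
    with ThreadPoolExecutor(max_workers=10) as pool:
        results = list(pool.map(fetch_url, urls))
    # Hand the payloads off via Redis so other tasks can process them.
    for url, body in zip(urls, results):
        r.rpush("downloaded_pages", json.dumps({"url": url, "body": body}))
    return len(results)


@shared_task
def process_next_page():
    raw = r.lpop("downloaded_pages")
    if raw:
        page = json.loads(raw)
        # ... data-mining work on page["body"] goes here ...
```

Memcached could stand in for Redis here, though a Redis list gives a convenient queue for the downstream processing tasks.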
