How to use completed_count to track task group completion in Celery?


Question

I am trying to use completed_count() to track how many tasks are left in a group in Celery.

My "client" runs this:

from celery import group
from proj import do  # the shared Celery task defined in the project module

# Read one word per line into a list (each entry keeps its trailing newline).
wordList = []
with open('word.txt') as wordData:
    for line in wordData:
        wordList.append(line)

# Build a group of do() signatures, one per word, and dispatch them all.
readAll = group(do.s(i) for i in wordList)

result = readAll.apply_async()
while not result.ready():
    print(result.completed_count())
result.get()

The 'word.txt' is just a file with one word on each line.

Then I have the celery worker(s) set to run the do task as:

from time import sleep

# Note: the task-decorator option is acks_late; task_acks_late is the
# name of the corresponding app-level configuration setting.
@app.task(acks_late=True)
def do(word):
    sleep(1)  # simulate a second of work per word
    return f"I'm doing {word}"

My broker is pyamqp and I use rpc as the result backend.

I thought it would print an increasing count of completed tasks on each loop iteration on the client side, but all I get are "0"s.

Answer

The problem is not in the completed_count method. You are getting zeros because result.ready() stays False even after all the tasks have completed. This looks like a bug in the rpc result backend; there is an issue about it on GitHub. Consider changing the backend setting to amqp, which works correctly as far as I can see.
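A minimal sketch of the suggested change, assuming an app defined in proj.py (the app name and broker URL below are placeholders, not from the question):

    from celery import Celery

    # Placeholder app definition; 'proj' and the broker URL are assumptions.
    # The answer's suggested fix is switching the result backend from
    # 'rpc://' to 'amqp://'. Note that the amqp result backend is deprecated
    # in recent Celery releases, so check the docs for your Celery version.
    app = Celery('proj',
                 broker='pyamqp://guest@localhost//',
                 backend='amqp://')   # was: backend='rpc://'

With the backend swapped, the client loop above should see result.ready() flip to True once the group finishes, and completed_count() should climb in the meantime.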
