Share memory areas between celery workers on one machine
Question
I want to share small pieces of information between my worker nodes (for example cached authorization tokens, statistics, ...) in celery.
If I create a global inside my tasks file, it's unique per worker (my workers are processes and have a lifetime of one task/execution).
What is the best practice? Should I save the state externally (in a DB), or create old-fashioned shared memory (which could be difficult because of the different pool implementations in celery)?
Thanks in advance!
Answer
I finally found a decent solution - the core Python multiprocessing Manager:
from multiprocessing import Manager

# Start a Manager server process that hosts the shared objects
manag = Manager()
serviceLock = manag.Lock()        # lock for guarding concurrent access
serviceStatusDict = manag.dict()  # dict shared between processes
This dict can be accessed from every process and it's synchronized, but you have to use the lock when accessing it concurrently (like in every other shared-memory implementation).
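As a minimal sketch of how this can be wired up, here the Manager proxies are handed to each worker via a pool initializer and a read-modify-write is done under the lock. A plain multiprocessing `Pool` and the names `init_worker`/`record` stand in for the actual celery worker setup, which isn't shown in the original answer:

```python
from multiprocessing import Manager, Pool

# Per-worker globals, populated by the pool initializer (illustrative names)
shared = None
lock = None

def init_worker(d, l):
    # Receive the Manager proxies when the worker process starts
    global shared, lock
    shared, lock = d, l

def record(key):
    # Read-modify-write is not atomic on a Manager dict, so hold the lock
    with lock:
        shared[key] = shared.get(key, 0) + 1

def demo():
    manager = Manager()
    d = manager.dict()
    l = manager.Lock()
    # Four worker processes all updating the same shared dict
    with Pool(4, initializer=init_worker, initargs=(d, l)) as pool:
        pool.map(record, ["token"] * 10)
    return dict(d)

if __name__ == "__main__":
    print(demo())  # {'token': 10}
```

Without the lock, two workers could read the same old count and both write back the same incremented value, losing updates; the Manager keeps the dict consistent across processes, but it doesn't make compound operations atomic.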