A good persistent synchronous queue in python

Problem Description

I don't immediately care about FIFO or FILO options, but it might be nice in the future.

What I'm looking for is a nice, fast, simple way to store entries on disk (at most a gig of data, or tens of millions of entries) that multiple processes can get from and put to. The entries are just simple 40-byte strings, not Python objects. I don't really need all the functionality of shelve.

I've seen this: http://code.activestate.com/lists/python-list/310105/ It looks simple. It needs to be upgraded to the new Queue version.

Wondering if there's something better? I'm concerned that in the event of a power interruption, the entire pickled file becomes corrupt instead of just one record.

Recommended Answer

Try using Celery. It's not pure Python, as it uses RabbitMQ as a backend, but it's reliable, persistent and distributed, and, all in all, far better than using files or a database in the long run.
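A minimal sketch of what this might look like, assuming a RabbitMQ broker running locally at the default amqp://guest@localhost// URL and a hypothetical process_entry task (neither is spelled out in the answer). The 40-byte strings are passed as task arguments, and the broker keeps them queued until a worker picks them up:

# tasks.py -- minimal Celery sketch (assumes RabbitMQ is running locally).
from celery import Celery

# The broker URL points Celery at RabbitMQ, which holds queued messages
# independently of the producer and worker processes.
app = Celery("tasks", broker="amqp://guest@localhost//")

@app.task
def process_entry(entry):
    # 'entry' would be one of the 40-byte strings; do the real work here.
    print("processing", entry)

A producer process would then enqueue entries with process_entry.delay(entry), and one or more worker processes (started with celery -A tasks worker in current Celery versions) would consume them, each getting its own record rather than sharing one pickled file.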

