Interprocess communication with SPSC queue in python


Problem Description


I have multiple write-heavy Python applications (producer1.py, producer2.py, ...) and I'd like to implement an asynchronous, non-blocking writer (consumer.py) as a separate process, so that the producers are not blocked by disk access or contention.

To make this more easily optimizable, assume I just need to expose a logging call that passes a fixed-length string from a producer to the writer, and the written file does not need to be sorted by call time. The target platform can be Linux-only. How should I implement this with a minimal latency penalty on the calling thread?

This seems like an ideal setup for multiple lock-free SPSC queues but I couldn't find any Python implementations.


Edit 1

I could implement a circular buffer as a memory-mapped file on /dev/shm, but I'm not sure if I'll have atomic CAS in Python?
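One thing worth noting about the CAS question: a true SPSC queue needs no CAS at all, because each index has exactly one writer (the producer owns the tail, the consumer owns the head), so plain loads and stores suffice. Below is a minimal sketch of such a ring buffer over `multiprocessing.shared_memory` (which is backed by `/dev/shm` on Linux); the class name, record size, and slot count are assumptions, and CPython gives no atomicity or memory-ordering guarantees for these shared-memory writes, so treat this as a sketch rather than production code.

```python
# A minimal SPSC ring-buffer sketch over shared memory (hypothetical names).
# Safe for exactly one producer process and one consumer process.
import struct
from multiprocessing import shared_memory

RECORD_SIZE = 64   # fixed-length records, as in the question
SLOTS = 1024       # capacity; one slot stays empty to tell "full" from "empty"
HEADER = 16        # two 8-byte indices: head (read) at 0, tail (write) at 8

class SPSCRing:
    def __init__(self, name=None, create=False):
        size = HEADER + RECORD_SIZE * SLOTS
        self.shm = shared_memory.SharedMemory(name=name, create=create, size=size)
        self.buf = self.shm.buf
        if create:
            self.buf[:HEADER] = b"\x00" * HEADER   # head = tail = 0

    def _get(self, off):
        return struct.unpack_from("<Q", self.buf, off)[0]

    def _set(self, off, val):
        struct.pack_into("<Q", self.buf, off, val)

    def push(self, record):
        """Producer side: returns False if the ring is full."""
        assert len(record) == RECORD_SIZE
        head, tail = self._get(0), self._get(8)
        if (tail + 1) % SLOTS == head:             # full
            return False
        off = HEADER + tail * RECORD_SIZE
        self.buf[off:off + RECORD_SIZE] = record
        self._set(8, (tail + 1) % SLOTS)           # publish after the data is written
        return True

    def pop(self):
        """Consumer side: returns None if the ring is empty."""
        head, tail = self._get(0), self._get(8)
        if head == tail:                           # empty
            return None
        off = HEADER + head * RECORD_SIZE
        record = bytes(self.buf[off:off + RECORD_SIZE])
        self._set(0, (head + 1) % SLOTS)
        return record
```

The producer would create the segment with `SPSCRing(create=True)` and pass `shm.name` to the consumer, which attaches with `SPSCRing(name=...)`.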

Solution

The simplest way would be to use an async TCP/Unix socket server in consumer.py.
Using HTTP would add unnecessary overhead in this case.

A producer, acting as a TCP/Unix socket client, sends data to the consumer; the consumer responds right away, before writing the data to the disk drive.
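The consumer side of this could be sketched with `asyncio` streams as below. The socket path, log path, single-byte ACK, and record size are assumptions for illustration, not part of the original answer; the key point is that the ACK is sent and flushed before the blocking disk write happens.

```python
# consumer.py — minimal sketch of the async Unix-socket consumer.
# SOCKET_PATH, LOG_PATH, RECORD_SIZE, and the ACK byte are assumed names/values.
import asyncio

SOCKET_PATH = "/tmp/log_consumer.sock"
RECORD_SIZE = 64                 # fixed-length records from the producers
LOG_PATH = "consumer_output.log"

async def handle_producer(reader, writer):
    with open(LOG_PATH, "ab") as f:
        while True:
            try:
                record = await reader.readexactly(RECORD_SIZE)
            except asyncio.IncompleteReadError:
                break                    # producer disconnected
            writer.write(b"\x06")        # ACK the producer right away...
            await writer.drain()
            f.write(record)              # ...then do the blocking disk write
            f.flush()
    writer.close()

async def main():
    server = await asyncio.start_unix_server(handle_producer, SOCKET_PATH)
    async with server:
        await server.serve_forever()
```

Running `asyncio.run(main())` starts the server; because each connection is handled in its own coroutine, one producer's pending write never stalls another producer's ACK.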

File I/O in the consumer is blocking, but as stated above it does not block the producers.
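On the producer side, the logging call then reduces to a send followed by a one-byte ACK read, so the calling thread waits only for a local socket round-trip rather than the disk. A minimal sketch, assuming the same socket path, record size, and ACK convention as the consumer:

```python
# producer side — minimal sketch of the logging call (hypothetical names).
import socket

SOCKET_PATH = "/tmp/log_consumer.sock"   # must match the consumer's path (assumed)
RECORD_SIZE = 64

class LogClient:
    def __init__(self, path=SOCKET_PATH):
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(path)

    def log(self, msg):
        # Pad or truncate to the fixed record length, as in the question.
        record = msg.encode("utf-8").ljust(RECORD_SIZE)[:RECORD_SIZE]
        self.sock.sendall(record)
        self.sock.recv(1)                # wait only for the ACK, not the disk write
```

Each producer keeps one long-lived `LogClient` and calls `log()` per message; reconnect handling and partial-read robustness are left out of the sketch.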
