Write to the same file from multiple processes at the same time?


Question
what i want to achieve:
i have a cgi file that writes an entry to a text file..
like a log entry (when was it invoked, when did its work end).
it's one line of text.

the problem is:
what happens if 2 users invoke the cgi at the same time?

and it will happen, because i am now stress testing it, so i will
start 5-10 requests in parallel and so on.

so, how does one synchronize several processes in python?

first idea was that the cgi would create a new temp file every time,
and at the end of the stress test, i'll collect the content of all those
files. but that seems like a stupid way to do it :(

another idea was to use a simple database (sqlite?) which probably has
this problem solved already...

any better ideas?

thanks,
gabor

Answers

gabor <ga***@nekomancer.net> writes:

so, how does one synchronize several processes in python?

first idea was that the cgi would create a new temp file every time,
and at the end of the stress test, i'll collect the content of all
those files. but that seems like a stupid way to do it :(





There was a thread about this recently ("low-end persistence
strategies") and for Unix the simplest answer seems to be the
fcntl.flock function. For Windows I don't know the answer.
Maybe os.open with O_EXCL works.
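A minimal sketch of the fcntl.flock suggestion on Unix: take an exclusive lock on the log file around each append, so concurrent CGI processes queue up instead of interleaving. The filename and message text are made up for illustration.

```python
import fcntl
import time

def append_log_line(path, line):
    """Append one line to the log, holding an exclusive lock for the write."""
    with open(path, "a") as f:
        # LOCK_EX blocks until no other process holds a lock on this file
        fcntl.flock(f, fcntl.LOCK_EX)
        try:
            f.write(line + "\n")
            f.flush()
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)

append_log_line("app.log", "cgi invoked at %s" % time.ctime())
```

The lock is also released automatically when the file is closed, so the explicit unlock mainly documents intent; the whole approach is Unix-only, as the answer notes.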


gabor <ga***@nekomancer.net> wrote:

so, how does one synchronize several processes in python?





This is a very hard problem to solve in the general case, and the answer
depends more on the operating system you're running on than on the
programming language you're using.

On the other hand, you said that each process will be writing a single line
of output at a time. If you call flush() after each message is written,
that should be enough to ensure that each line gets written in a single
write system call, which in turn should be good enough to ensure that
individual lines of output are not scrambled in the log file.

If you want to do better than that, you need to delve into OS-specific
things like the flock function in the fcntl module on unix.
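The flush-after-each-line idea above can be sketched as follows, assuming a made-up log filename. Each entry is one short line emitted in a single write() call to a file opened in append mode, which on POSIX systems means concurrent writers append whole lines rather than interleaving bytes.

```python
import time

LOG_PATH = "cgi.log"  # hypothetical log file for illustration

def log_line(message):
    # append mode + one write() per line; flush() hands the line to the
    # OS immediately instead of sitting in this process's buffer
    with open(LOG_PATH, "a") as f:
        f.write(message + "\n")
        f.flush()

log_line("invoked at %.3f" % time.time())
log_line("work ended")
```

This is best-effort rather than a guarantee: it relies on each line being small enough for a single atomic append, which is why the answer points to flock for anything stronger.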


Roy Smith wrote:
gabor <ga***@nekomancer.net> wrote:

On the other hand, you said that each process will be writing a single line
of output at a time. If you call flush() after each message is written,
that should be enough to ensure that each line gets written in a single
write system call, which in turn should be good enough to ensure that
individual lines of output are not scrambled in the log file.

Unfortunately this assumes that the open() call will always succeed,
when in fact it is likely to fail sometimes when another process has
already opened the file but not yet completed writing to it, AFAIK.

If you want to do better than that, you need to delve into OS-specific
things like the flock function in the fcntl module on unix.








The OP was probably on the right track when he suggested that things
like SQLite (conveniently wrapped with PySQLite) had already solved this
problem.

-Peter
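The SQLite route can be sketched like this: the database engine serializes concurrent writers itself, so each CGI process just inserts a row. The database filename and table layout here are invented for illustration.

```python
import sqlite3
import time

def log_entry(db_path, message):
    # timeout=10 makes this writer wait up to 10 seconds for another
    # process's write lock instead of failing immediately
    conn = sqlite3.connect(db_path, timeout=10)
    try:
        conn.execute("CREATE TABLE IF NOT EXISTS log (ts REAL, message TEXT)")
        conn.execute("INSERT INTO log (ts, message) VALUES (?, ?)",
                     (time.time(), message))
        conn.commit()
    finally:
        conn.close()

log_entry("requests.db", "cgi invoked")
log_entry("requests.db", "work ended")
```

Compared with the flock sketch, this also gives you timestamps you can query and sort after the stress test, at the cost of a small dependency on the sqlite3 module.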

