How to log to a single file with concurrent processes in Django without exclusive locks


Question


Given a Django application that is being executed concurrently on multiple servers, how can this application log to a single shared log file (in a network share), without keeping this file permanently open in exclusive mode?


This situation applies to Django applications hosted on Windows Azure Websites when you want to take advantage of log streaming.


In this sample project, I've tried using ConcurrentLogHandler like this:

In settings.py (https://github.com/fernandoacorreia/DjangoWAWSLogging/blob/master/DjangoWAWSLogging/DjangoWAWSLogging/settings.py):

'ConcurrentLogHandler':{
    'level': 'DEBUG',
    'class': 'cloghandler.ConcurrentRotatingFileHandler',
    'formatter': 'verbose',
    'filename': os.getenv('LOGFILE', 'django.log')
},
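For context, the snippet above is a handler entry inside Django's LOGGING dict. A minimal complete configuration wiring it up might look like the following sketch; the formatter string and the root-logger wiring are illustrative assumptions, not taken from the original project:

```python
import os

# Hypothetical full LOGGING dict for settings.py. The 'verbose' formatter
# string and the root-logger section are illustrative placeholders.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'verbose': {
            'format': '%(levelname)s %(asctime)s %(module)s %(message)s',
        },
    },
    'handlers': {
        'ConcurrentLogHandler': {
            'level': 'DEBUG',
            'class': 'cloghandler.ConcurrentRotatingFileHandler',
            'formatter': 'verbose',
            'filename': os.getenv('LOGFILE', 'django.log'),
        },
    },
    'root': {
        'handlers': ['ConcurrentLogHandler'],
        'level': 'DEBUG',
    },
}
```

Django passes this dict to `logging.config.dictConfig` at startup, which is when `cloghandler.ConcurrentRotatingFileHandler` actually gets imported and instantiated.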

In views.py (https://github.com/fernandoacorreia/DjangoWAWSLogging/blob/master/DjangoWAWSLogging/DjangoWAWSLogging/views.py):

from time import gmtime, strftime
import logging
from django.http import HttpResponse

logger = logging.getLogger(__name__)

def home(request):
    current_time = strftime("%Y-%m-%d %H:%M:%S", gmtime())
    logger.info('home ' + current_time)
    return HttpResponse("Hello from Django! It is now " + current_time + ".\n")


The logs are written but the file doesn't seem to be flushed while the website is running. Also, if I try to read the file using FTP I get this message: "550 The process cannot access the file because it is being used by another process."


If I stop the application, the file is closed and I can read the file and see all the logs in it.


I assume that ConcurrentLogHandler would allow shared access to the log file. Is this assumption wrong? Is there some additional configuration needed? Is there an alternative?

Answer


An alternative would be for all Django logging to be sent to a queue (e.g. a Redis queue, using something like this, or a multiprocessing.Queue) and then a single process reads the queue and writes records to file. There are more moving parts, so this may or may not be appropriate for your needs, but it would eliminate the file contention. See this post for more options when using logging from multiple processes.
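The queue approach can be sketched with the standard library's QueueHandler/QueueListener (Python 3.2+). This is a single-process demonstration under assumed names (the 'shared.log' path and 'demo' logger are placeholders); in a real deployment, each Django process would attach a QueueHandler and one dedicated process would run the listener:

```python
import logging
import logging.handlers
import multiprocessing

# Producers attach a QueueHandler; a single QueueListener drains the
# queue and writes to the shared file, so only one process ever holds
# the file open.
queue = multiprocessing.Queue(-1)

file_handler = logging.FileHandler('shared.log', mode='w')
file_handler.setFormatter(logging.Formatter('%(levelname)s %(message)s'))

listener = logging.handlers.QueueListener(queue, file_handler)
listener.start()

logger = logging.getLogger('demo')
logger.setLevel(logging.INFO)
logger.addHandler(logging.handlers.QueueHandler(queue))

logger.info('hello from a worker')

listener.stop()  # drains the queue and stops the listener thread
file_handler.close()
```

Because only the listener process touches the file, the FTP "550" sharing violation from the question cannot occur for the producer processes.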

You can of course also set up a socket server and use a SocketHandler to send logging events from all Django processes to the server, which writes to file. The Python docs contain a working example of such a server (http://docs.python.org/2/howto/logging-cookbook.html#sending-and-receiving-logging-events-across-a-network).
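On the client side, attaching a SocketHandler is only a few lines. This sketch assumes a receiver like the cookbook's LogRecordSocketReceiver is listening; the host 'localhost' and the logger name 'django_app' are placeholders:

```python
import logging
import logging.handlers

# DEFAULT_TCP_LOGGING_PORT is 9020; host and logger name are placeholders.
socket_handler = logging.handlers.SocketHandler(
    'localhost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)

logger = logging.getLogger('django_app')
logger.setLevel(logging.DEBUG)
logger.addHandler(socket_handler)

# Each LogRecord is pickled and sent over TCP. If no receiver is
# listening, SocketHandler retries the connection with a backoff and
# drops records in the meantime rather than raising.
logger.info('event sent to the central log server')
```

The server side then unpickles each record and dispatches it to an ordinary file handler, so again only one process writes to the log file.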

