TimedRotatingFileHandler doesn't work correctly in Django with multiple instances
Problem description
I use TimedRotatingFileHandler to log Django output and rotate daily, but when I check the log file I see a strange issue: yesterday's log is truncated, only a few of today's entries are present, and yesterday's log is lost!
Django 1.4
uwsgi 1.4.9
Python 2.6
I start 8 Django instances with uwsgi. The settings.py is:
'handlers': {
    'apilog': {
        'level': 'INFO',
        'class': 'logging.handlers.TimedRotatingFileHandler',
        'filename': os.path.join(APILOG, "apilog.log"),
        'when': 'midnight',
        'formatter': 'info',
        'interval': 1,
        'backupCount': 0,
    },
},
'loggers': {
    'apilog': {
        'handlers': ['apilog'],
        'level': 'INFO',
        'propagate': True
    },
}
Did I miss something? Why is the old log lost?
Recommended answer
You should not be logging to a file-based handler from multiple processes concurrently - that is not supported, as there is no portable OS support for it.
To log to a single destination from multiple processes, you can use one of the following approaches:
- Use something like ConcurrentLogHandler
- Use a SysLogHandler (or NTEventLogHandler on Windows)
- Use a SocketHandler which sends the logs to a separate process for writing to file
- Use a QueueHandler with a multiprocessing.Queue, as outlined here.