Serilog Rolling logs in one file only


Question

Is there a way to setup Serilog to keep logging in the same file while maintaining a max file size?

In other words, If I specify the max file size to be 100MB, the process should remove earlier entries from the file before adding new ones.

Answer

TL;DR no; the File sink (or its RollingFile predecessor) doesn't provide such a facility, and is unlikely to do so at any point for disk-backed logs.

So, the best solution available is to set a max count of 2 log files.
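A minimal sketch of that workaround, using the `Serilog.Sinks.File` package's `fileSizeLimitBytes`, `rollOnFileSizeLimit`, and `retainedFileCountLimit` options (the file name and exact sizes here are illustrative assumptions):

```csharp
using Serilog;

class Program
{
    static void Main()
    {
        // Approximate a ~100 MB cap with two 50 MB files: when log.txt
        // reaches the size limit, Serilog rolls to a new file and deletes
        // the oldest so only the newest two files remain on disk.
        Log.Logger = new LoggerConfiguration()
            .WriteTo.File("log.txt",
                fileSizeLimitBytes: 50_000_000,   // roll at ~50 MB
                rollOnFileSizeLimit: true,        // start a new file when full
                retainedFileCountLimit: 2)        // keep only the 2 newest files
            .CreateLogger();

        Log.Information("Hello, rolling file!");
        Log.CloseAndFlush();
    }
}
```

The trade-off versus true in-place trimming: at any moment you hold between one and two files' worth of history, and old entries disappear a whole file at a time rather than line by line.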

The root issue with achieving what you describe is that, given the way most filesystems work, it would necessitate rewriting the entire file to remove the trimmed entries - this would mean the writer would be blocking things (and causing undue work) whenever the trim takes place. Another issue that would need to be surmounted is managing the race condition implicit in multiple writers all trying to effect the trim at the same time (and/or with slightly different parameters).

(If you look in the Serilog GitHub issues list, you'll see people from time to time asking the same question in a different way, by asking for a log that has the most recent information at the top of the file.)

UPDATE: Someone hacked it in - that issue covers this territory in good detail - my comment there is more complete and organized than this answer ;)

