subprocess stdout/stderr to finite size logfile


Problem description

I have a process which chats a lot to stderr, and I want to log that stuff to a file.

foo 2> /tmp/foo.log

Actually I'm launching it with python subprocess.Popen, but it may as well be from the shell for the purposes of this question.

with open('/tmp/foo.log', 'w') as stderr:
  foo_proc = subprocess.Popen(['foo'], stderr=stderr)

The problem is after a few days my log file can be very large, like >500 MB. I am interested in all that stderr chat, but only the recent stuff. How can I limit the size of the logfile to, say, 1 MB? The file should be a bit like a circular buffer in that the most recent stuff will be written but the older stuff should fall out of the file, so that it never goes above a given size.

I'm not sure if there's an elegant Unixey way to do this already which I'm simply not aware of, with some sort of special file.

An alternative solution with log rotation would be sufficient for my needs as well, as long as I don't have to interrupt the running process.

Solution

You should be able to use the stdlib logging package to do this. Instead of connecting the subprocess' output directly to a file, you can do something like this:

import logging

logger = logging.getLogger('foo')

def stream_reader(stream):
    # Log each line of the subprocess' stderr until the pipe is closed.
    while True:
        line = stream.readline()
        if not line:  # EOF: the subprocess has closed its end of the pipe
            break
        logger.debug('%s', line.strip())

This just logs every line received from the stream until the subprocess closes the pipe. You can configure logging with a RotatingFileHandler, which provides log file rotation, and then arrange to read this data and log it, for example on a background thread:

import subprocess
import threading

# text=True makes readline() return str rather than bytes.
foo_proc = subprocess.Popen(['foo'], stderr=subprocess.PIPE, text=True)

# Read the pipe on a background thread so the main program can keep working.
thread = threading.Thread(target=stream_reader, args=(foo_proc.stderr,))
thread.daemon = True  # optional
thread.start()

# do other stuff

thread.join() # await thread termination (optional for daemons)

Of course you can call stream_reader(foo_proc.stderr) too, but I'm assuming you might have other work to do while the foo subprocess does its stuff.
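
For completeness, here is a minimal sketch of that synchronous variant, assuming the same stream_reader function and logging configuration shown in this answer; the call simply blocks until foo closes its stderr:

import subprocess

foo_proc = subprocess.Popen(['foo'], stderr=subprocess.PIPE, text=True)
stream_reader(foo_proc.stderr)  # blocks until foo closes its stderr
foo_proc.wait()                 # then reap the child process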

Here's one way you could configure logging (code that should only be executed once):

import logging, logging.handlers

# Arguments: filename, mode, maxBytes, backupCount.
handler = logging.handlers.RotatingFileHandler('/tmp/foo.log', 'a', 100000, 10)
logging.getLogger().addHandler(handler)
logging.getLogger('foo').setLevel(logging.DEBUG)

This will keep foo.log plus up to 10 rotated backups of roughly 100 KB each (foo.log.1, foo.log.2 and so on, where foo.log is always the most recent). You could also pass in 1000000, 1 to give you just foo.log and foo.log.1, where rotation happens whenever the file would exceed 1000000 bytes in size.
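
Putting the pieces together, here is a minimal end-to-end sketch under the same assumptions (a chatty command named foo, and the /tmp/foo.log path from the question):

import logging
import logging.handlers
import subprocess
import threading

# Rotation setup: /tmp/foo.log plus up to 10 backups of roughly 100 KB each.
handler = logging.handlers.RotatingFileHandler('/tmp/foo.log', 'a', 100000, 10)
logging.getLogger().addHandler(handler)

logger = logging.getLogger('foo')
logger.setLevel(logging.DEBUG)

def stream_reader(stream):
    # Log each stderr line until the subprocess closes the pipe.
    while True:
        line = stream.readline()
        if not line:
            break
        logger.debug('%s', line.strip())

foo_proc = subprocess.Popen(['foo'], stderr=subprocess.PIPE, text=True)
reader = threading.Thread(target=stream_reader, args=(foo_proc.stderr,))
reader.daemon = True
reader.start()

# ... do other work while foo runs ...

foo_proc.wait()
reader.join()

The daemon flag is optional: it only matters if the main program may exit while the reader thread is still blocked on the pipe.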
