Read subprocess stdout and stderr concurrently


Problem description

I'm trying to run a lengthy command within Python that outputs to both stdout and stderr. I'd like to poll the subprocess and write the output to separate files.

I tried the following, based on this answer: Non-blocking read on a subprocess.PIPE in python

import os
import shlex
import subprocess

from Queue import Queue, Empty
from threading import Thread

def send_cmd(cmd, shell=False):
    """
    Send cmd to the shell
    """
    if not isinstance(cmd, list): cmd = shlex.split(cmd)

    params = {'args'   : cmd,
              'stdout' : subprocess.PIPE,
              'stderr' : subprocess.PIPE,
              'shell'  : shell}

    proc = subprocess.Popen(**params)

    return proc

def monitor_command(process, stdout_log=os.devnull, stderr_log=os.devnull):
    """
    Monitor the process that is running, and log it if desired
    """
    def enqueue_output(out, queue):
        for line in iter(out.readline, b''):
            queue.put(line)

    def setup_process(log_name, proc):
        FID = open(log_name, 'w')
        queue = Queue()
        thread = Thread(target=enqueue_output, args=(proc, queue))
        thread.daemon = True # Thread dies with program
        thread.start()

        return (queue, FID)

    def check_queues(queue_list, errors):
        for queue, FID in queue_list:
            try:
                line = queue.get_nowait()
                if 'error' in line.lower() or 'failed' in line.lower():
                    errors.append(line)
            except Empty:
                pass
            else:
                FID.write(line)

    errors = []
    queue_list = []

    for log, proc in [(stdout_log, process.stdout), (stderr_log, process.stderr)]:
        queue_list.append(setup_process(log, proc))

    while process.poll() is None:
        check_queues(queue_list, errors)

    while not queue_list[0][0].empty() or not queue_list[1][0].empty():
        check_queues(queue_list, errors)

    for queue, FID in queue_list:
        FID.close()

    return errors

process = send_cmd('long_program.exe')
errors  = monitor_command(process, stdout_log='stdout.log', stderr_log='stderr.log')

But the output file for stdout is empty, and the output file for stderr is only a few lines long, whereas both should be quite large.

What am I missing?

Recommended answer

The code looks more complicated than the task requires. I don't see why you need to call process.poll() or queue.get_nowait() here. To deliver a subprocess's stdout/stderr to several sinks, you could start with teed_call(), which accepts arbitrary file-like objects: you could pass log files as well as special file-like objects that accumulate errors in their .write() methods.
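The teed_call() mentioned above comes from another of the answerer's posts; the version below is an illustrative Python 3 sketch of that idea, not the original code. The names teed_call and ErrorCollector are assumptions for this example: each output stream gets one reader thread that copies lines to every sink, and the threads are joined after the process exits so no output is lost.

```python
import subprocess
import threading

def teed_call(cmd, stdout_sinks=(), stderr_sinks=(), **kwargs):
    """Run cmd, copying each output stream into every sink in its list.

    Sinks are arbitrary file-like objects (anything with a .write()
    method that accepts bytes). Reader threads are joined before
    returning, so all pending output is delivered.
    """
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE if stdout_sinks else None,
        stderr=subprocess.PIPE if stderr_sinks else None,
        **kwargs)

    def tee(pipe, sinks):
        # Copy the pipe line by line into every sink, then close it.
        with pipe:
            for line in iter(pipe.readline, b''):
                for sink in sinks:
                    sink.write(line)

    threads = []
    for pipe, sinks in [(proc.stdout, stdout_sinks),
                        (proc.stderr, stderr_sinks)]:
        if sinks:
            t = threading.Thread(target=tee, args=(pipe, sinks))
            t.daemon = True
            t.start()
            threads.append(t)

    proc.wait()        # wait for the child to exit...
    for t in threads:  # ...then join readers so nothing pending is dropped
        t.join()
    return proc.returncode

class ErrorCollector:
    """File-like sink that keeps lines containing 'error' or 'failed'."""
    def __init__(self):
        self.errors = []
    def write(self, line):
        text = line.decode('utf-8', 'replace')
        if 'error' in text.lower() or 'failed' in text.lower():
            self.errors.append(text)
```

With this shape, logging and error scanning are just two sinks on the same stream, e.g. `teed_call(cmd, stderr_sinks=[logfile, ErrorCollector()])`.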

To fix your code with minimal changes, you should call .join() on the reader threads (even if process.poll() is not None, i.e., the subprocess has exited, there could still be some pending output; joining the reader threads ensures that all output is read).
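Applied to the question's structure, the minimal fix looks like the Python 3 sketch below (the `queue` module replaces Python 2's `Queue`; the function name `monitor` and the error keywords are taken from the question). The key change is keeping references to the reader threads, joining them after the process exits, and only then draining the queues:

```python
import queue
import subprocess
import threading

def enqueue_output(pipe, q):
    # Reader thread body: push every line from the pipe into the queue.
    for line in iter(pipe.readline, b''):
        q.put(line)
    pipe.close()

def monitor(process, stdout_file, stderr_file):
    """Drain the stdout/stderr pipes of `process` into the given files,
    collecting lines that look like errors. The reader threads are
    joined after the process exits, so no pending output is lost."""
    errors, workers = [], []
    for pipe, fid in [(process.stdout, stdout_file),
                      (process.stderr, stderr_file)]:
        q = queue.Queue()
        t = threading.Thread(target=enqueue_output, args=(pipe, q))
        t.daemon = True
        t.start()
        workers.append((t, q, fid))

    process.wait()               # the readers keep draining the pipes
    for t, q, fid in workers:
        t.join()                 # ensure the reader saw EOF
        while not q.empty():     # safe: no producer is left running
            text = q.get_nowait().decode('utf-8', 'replace')
            if 'error' in text.lower() or 'failed' in text.lower():
                errors.append(text)
            fid.write(text)
    return errors
```

Note that process.wait() replaces the polling loop entirely: because the threads keep draining both pipes, the child cannot block on a full pipe buffer while the parent waits.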
