Wrap subprocess' stdout/stderr


Problem description


I'd like to both capture and display the output of a process that I invoke through Python's subprocess.


I thought I could just pass my file-like object as the named parameters stdout and stderr.


I can see that it accesses the fileno attribute - so it is doing something with the object. However, the write() method is never invoked. Is my approach completely off, or am I just missing something?

import subprocess
import sys

class Process(object):
    class StreamWrapper(object):
        def __init__(self, stream):
            self._stream = stream
            self._buffer = []

        def _print(self, msg):
            print(repr(self), msg)

        def __getattr__(self, name):
            if name not in ['fileno']:
                self._print("# Redirecting: %s" % name)
            return getattr(self._stream, name)

        def write(self, data):
            print("###########")
            self._buffer.append(data)
            self._stream.write(data)
            self._stream.flush()

        def getBuffer(self):
            return self._buffer[:]

    def __init__(self, *args, **kwargs):
        print(">> Running `%s`" % " ".join(args[0]))
        self._stdout = self.StreamWrapper(sys.stdout)
        self._stderr = self.StreamWrapper(sys.stderr)
        kwargs.setdefault('stdout', self._stdout)
        kwargs.setdefault('stderr', self._stderr)
        self._process = subprocess.Popen(*args, **kwargs)
        self._process.communicate()



Update:

Something I'd like to have work as well: ANSI control characters that move the cursor and overwrite previously output text. I don't know whether that is the correct term, but here's an example of what I mean: I'm trying to automate some GIT stuff, and git has progress output that updates itself in place instead of writing a new line each time.


It is important to me that the output of the subprocess is displayed immediately. I've tried using subprocess.PIPE to capture the output and display it manually, but I was only able to get it to display the output once the process had completed. However, I'd like to see the output in real time.
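For reference, this is what such self-updating progress output looks like on the wire. The sketch below uses a small stand-in child process rather than actual git: the whole progress "line" is a series of carriage returns with no newline until the very end, so any readline()-based consumer sees nothing until the process is nearly done.

```python
import subprocess
import sys

# A stand-in child that rewrites one progress line with carriage
# returns, the way git's progress output does (illustrative only).
child = (
    "import sys\n"
    "for pct in (10, 50, 100):\n"
    "    sys.stdout.write('\\rProgress: %3d%%' % pct)\n"
    "    sys.stdout.flush()\n"
    "sys.stdout.write('\\n')\n"
)
proc = subprocess.Popen([sys.executable, "-c", child],
                        stdout=subprocess.PIPE)
data, _ = proc.communicate()
# Everything arrives before the first (and only) newline, so
# real-time display of such output requires reading raw chunks
# rather than whole lines.
print(repr(data))
```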

Answer


Stdin, stdout and stderr of a process need to be real file descriptors. (That is actually not a restriction imposed by Python, but rather how pipes work on the OS level.) So you will need a different solution.
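A minimal sketch of what that implies (my illustration, not code from the answer): Popen only asks the stdout object for an OS-level descriptor via fileno(), so an object without one is rejected outright, and when one exists the child writes to it directly, bypassing any Python-level write() method.

```python
import io
import subprocess
import sys
import tempfile

# An in-memory stream has no OS-level descriptor: Popen calls its
# fileno() while setting up the child, which raises immediately.
try:
    subprocess.run([sys.executable, "-c", "print('hi')"],
                   stdout=io.StringIO())
    error = None
except io.UnsupportedOperation as exc:
    error = exc

# A real file does have a descriptor; the child inherits it and writes
# to it at the OS level, so a write() method defined on a Python
# wrapper object would never be consulted.
with tempfile.TemporaryFile(mode="w+") as f:
    subprocess.run([sys.executable, "-c", "print('hi')"], stdout=f)
    f.seek(0)
    captured = f.read()
```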


If you want to track both stdout and stderr in real time, you will need asynchronous I/O or threads.


  • Asynchronous I/O: With the standard synchronous (=blocking) I/O, a read to one of the streams could block, disallowing access to the other one in real time. If you are on Unix, you can use non-blocking I/O as described in this answer. However, on Windows you will be out of luck with this approach. More on asynchronous I/O in Python and some alternatives are shown in this video.
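A rough Unix-oriented sketch of that idea using the standard selectors module (my illustration, not the code from the linked answer; on Windows the default selector cannot watch pipes):

```python
import selectors
import subprocess
import sys

# A child that writes to both streams.
proc = subprocess.Popen(
    [sys.executable, "-c",
     "import sys;"
     "sys.stdout.write('out line\\n'); sys.stdout.flush();"
     "sys.stderr.write('err line\\n'); sys.stderr.flush()"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)

sel = selectors.DefaultSelector()
sel.register(proc.stdout, selectors.EVENT_READ, data="stdout")
sel.register(proc.stderr, selectors.EVENT_READ, data="stderr")

captured = {"stdout": b"", "stderr": b""}
while sel.get_map():                      # until both streams hit EOF
    for key, _ in sel.select():
        chunk = key.fileobj.read1(4096)   # returns whatever is available
        if chunk:
            captured[key.data] += chunk                   # capture...
            getattr(sys, key.data).write(chunk.decode())  # ...and display
        else:                             # EOF on this stream
            sel.unregister(key.fileobj)
            key.fileobj.close()
proc.wait()
```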


  • Threads: Another common way to deal with this problem is to create one thread for each file descriptor you want to read from in real time. The threads only handle the file descriptor they are assigned to, so blocking I/O won't do harm.
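A sketch of the thread-per-stream variant (the pump helper and its names are mine, not from the answer): each thread blocks on its own pipe, copying data to the parent's stream as soon as it arrives while also keeping a capture buffer.

```python
import subprocess
import sys
import threading

def pump(src, dst, buffer):
    # Blocking reads are harmless here: this thread serves only one pipe.
    for line in iter(src.readline, b""):
        buffer.append(line)           # capture
        dst.write(line.decode())      # display as soon as it arrives
        dst.flush()
    src.close()

proc = subprocess.Popen(
    [sys.executable, "-c",
     "import sys; print('to stdout'); print('to stderr', file=sys.stderr)"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)

out_buf, err_buf = [], []
threads = [
    threading.Thread(target=pump, args=(proc.stdout, sys.stdout, out_buf)),
    threading.Thread(target=pump, args=(proc.stderr, sys.stderr, err_buf)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
proc.wait()
```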

