Python multiprocessing: How can I RELIABLY redirect stdout from a child process?


Question

NB. I have seen Log output of multiprocessing.Process - unfortunately, it doesn't answer this question.

I am creating a child process (on windows) via multiprocessing. I want all of the child process's stdout and stderr output to be redirected to a log file, rather than appearing at the console. The only suggestion I have seen is for the child process to set sys.stdout to a file. However, this does not effectively redirect all stdout output, due to the behaviour of stdout redirection on Windows.

To illustrate the problem, build a Windows DLL with the following code

#include <iostream>

extern "C"
{
    __declspec(dllexport) void writeToStdOut()
    {
        std::cout << "Writing to STDOUT from test DLL" << std::endl;
    }
}

Then create and run a python script like the following, which imports this DLL and calls the function:

from ctypes import CDLL
import sys

print()
print("Writing to STDOUT from python, before redirect")
print()
sys.stdout = open("stdout_redirect_log.txt", "w")
print("Writing to STDOUT from python, after redirect")

testdll = CDLL("Release/stdout_test.dll")
testdll.writeToStdOut()

In order to see the same behaviour as me, it is probably necessary for the DLL to be built against a different C runtime than the one Python uses. In my case, Python is built with Visual Studio 2010, but my DLL is built with VS 2005.

The behaviour I see is that the console shows:

> stdout_test.py

Writing to STDOUT from python, before redirect

Writing to STDOUT from test DLL

While the file stdout_redirect_log.txt ends up containing:

Writing to STDOUT from python, after redirect

In other words, setting sys.stdout failed to redirect the stdout output generated by the DLL. This is unsurprising given the nature of the underlying APIs for stdout redirection in Windows. I have encountered this problem at the native/C++ level before and never found a way to reliably redirect stdout from within a process. It has to be done externally.

This is actually the very reason I am launching a child process - it's so that I can connect externally to its pipes and thus guarantee that I am intercepting all of its output. I can definitely do this by launching the process manually with pywin32, but I would very much like to be able to use the facilities of multiprocessing, in particular the ability to communicate with the child process via a multiprocessing Pipe object, in order to get progress updates. The question is whether there is any way to both use multiprocessing for its IPC facilities and to reliably redirect all of the child's stdout and stderr output to a file.

UPDATE: Looking at the source code for multiprocessing.Process, it has a static member, _Popen, which looks like it can be used to override the class used to create the process. If it's set to None (the default), it uses multiprocessing.forking._Popen, but it looks like by saying

multiprocessing.Process._Popen = MyPopenClass

I could override the process creation. However, although I could derive this from multiprocessing.forking._Popen, it looks like I would have to copy a bunch of internal stuff into my implementation, which sounds flaky and not very future-proof. If that's the only choice I think I'd probably plump for doing the whole thing manually with pywin32 instead.

Answer

The solution you suggest is a good one: create your processes manually such that you have explicit access to their stdout/stderr file handles. You can then create a socket to communicate with the sub-process and use multiprocessing.connection over that socket (multiprocessing.Pipe creates the same type of connection object, so this should give you all the same IPC functionality).

Here's a two-file example.

master.py:

import multiprocessing.connection
import subprocess
import socket
import errno
import sys, os

## Listen for a connection from the remote process (and find a free port number)
port = 10000
while True:
    try:
        l = multiprocessing.connection.Listener(('localhost', port), authkey=b"secret")
        break
    except socket.error as ex:
        if ex.errno != errno.EADDRINUSE:
            raise
        port += 1  ## port is already in use; try the next one

proc = subprocess.Popen((sys.executable, "subproc.py", str(port)), stdout=subprocess.PIPE, stderr=subprocess.PIPE)

## open connection for remote process
conn = l.accept()
conn.send([1, "asd", None])
print(proc.stdout.readline())

subproc.py:

import multiprocessing.connection
import sys

port = int(sys.argv[1])
conn = multiprocessing.connection.Client(('localhost', port), authkey=b"secret")

while True:
    try:
        obj = conn.recv()
        print("received: %s\n" % str(obj))
        sys.stdout.flush()
    except EOFError:  ## connection closed
        break

You may also want to see the first answer to this question to get non-blocking reads from the subprocess.
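Since the goal in the question is a log file rather than interactive reads, the single readline() in master.py could instead be replaced by background threads that drain each pipe into the log, so the child never blocks on a full pipe buffer. The command and filenames below are illustrative:

```python
# Sketch: draining a child process's stdout and stderr into one log file
# on background threads. Each thread copies its pipe line by line; per-line
# writes from the two threads may interleave in the log, which is usually
# acceptable for logging.
import subprocess
import sys
import threading

def drain(pipe, logfile):
    """Copy everything the child writes on `pipe` into `logfile`."""
    for line in iter(pipe.readline, b""):
        logfile.write(line)
        logfile.flush()
    pipe.close()

# Stand-in child that writes to both streams.
proc = subprocess.Popen(
    [sys.executable, "-c",
     "import sys; print('to stdout'); print('to stderr', file=sys.stderr)"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)

with open("child_output.log", "wb") as log:
    threads = [
        threading.Thread(target=drain, args=(proc.stdout, log)),
        threading.Thread(target=drain, args=(proc.stderr, log)),
    ]
    for t in threads:
        t.start()
    proc.wait()
    for t in threads:
        t.join()
```

Because the child is launched with subprocess rather than multiprocessing, everything it sends to stdout or stderr, including output from native DLLs, flows through the pipes and into the log, while the multiprocessing.connection socket carries the progress updates.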
