How to redirect python subprocess stderr and stdout to multiple files?
I want to redirect stderr and stdout to multiple files. For example, stderr should be redirected to both file_1 and file_2.
I am using the line below to redirect output to a single file:
subprocess.Popen("my_commands",shell=True,stdout=log_file,stderr=err_file,executable="/bin/bash")
The above redirects stdout and stderr each to a single file.
Can anybody tell me how to do the same with multiple files, e.g. stdout redirected to both log_file and err_file, and stderr redirected to both err_file and new_file?
You can create your own file-like class that writes to multiple file handles. Here's a simple example, with a test that redirects sys.stdout and sys.stderr.
import sys

class MultiOut(object):
    def __init__(self, *args):
        self.handles = args

    def write(self, s):
        for f in self.handles:
            f.write(s)

with open('q1', 'w') as f1, open('q2', 'w') as f2, open('q3', 'w') as f3:
    sys.stdout = MultiOut(f1, f2)
    sys.stderr = MultiOut(f3, f2)
    for i, c in enumerate('abcde'):
        print(c, 'out')
        print(i, 'err', file=sys.stderr)
After running that code, here's what those files contain:
q1
a out
b out
c out
d out
e out
q3
0 err
1 err
2 err
3 err
4 err
q2
a out
0 err
b out
1 err
c out
2 err
d out
3 err
e out
4 err
FWIW, you can even do this, if you like:
sys.stdout = MultiOut(f1, f2, sys.stdout)
sys.stderr = MultiOut(f3, f2, sys.stderr)
Unfortunately, file-like objects like MultiOut can't be used with Popen because Popen accesses files via the underlying OS file descriptor, i.e., it wants something that the OS considers to be a file, so only Python objects that supply a valid fileno method can be used for Popen's file arguments.
Instead, we can use Python 3's asyncio features to execute the shell command and to copy its stdout and stderr output concurrently.
Firstly, here's a simple Bash script that I used to test the following Python code. It simply loops over an array, echoing the array contents to stdout and the array indices to stderr, like the previous Python example.
multitest.bsh
#!/usr/bin/env bash

a=(a b c d e)
for ((i=0; i<${#a[@]}; i++))
do
    echo "OUT: ${a[i]}"
    echo "ERR: $i" >&2
    sleep 0.01
done
output
OUT: a
ERR: 0
OUT: b
ERR: 1
OUT: c
ERR: 2
OUT: d
ERR: 3
OUT: e
ERR: 4
And here's Python 3 code that runs multitest.bsh, piping its stdout output to files q1 and q2, and its stderr output to q3 and q2.
import asyncio
from asyncio.subprocess import PIPE

class MultiOut(object):
    def __init__(self, *args):
        self.handles = args

    def write(self, s):
        for f in self.handles:
            f.write(s)

    def close(self):
        pass

@asyncio.coroutine
def copy_stream(stream, outfile):
    """ Read from stream line by line until EOF, copying it to outfile. """
    while True:
        line = yield from stream.readline()
        if not line:
            break
        outfile.write(line)  # assume it doesn't block

@asyncio.coroutine
def run_and_pipe(cmd, fout, ferr):
    # start process
    process = yield from asyncio.create_subprocess_shell(cmd,
        stdout=PIPE, stderr=PIPE, executable="/bin/bash")

    # read child's stdout/stderr concurrently
    try:
        yield from asyncio.gather(
            copy_stream(process.stdout, fout),
            copy_stream(process.stderr, ferr))
    except Exception:
        process.kill()
        raise
    finally:
        # wait for the process to exit
        rc = yield from process.wait()
    return rc

# run the event loop
loop = asyncio.get_event_loop()

with open('q1', 'wb') as f1, open('q2', 'wb') as f2, open('q3', 'wb') as f3:
    fout = MultiOut(f1, f2)
    ferr = MultiOut(f3, f2)
    rc = loop.run_until_complete(run_and_pipe("./multitest.bsh", fout, ferr))
loop.close()

print('Return code:', rc)
After running the code, here's what those files contain:
q1
OUT: a
OUT: b
OUT: c
OUT: d
OUT: e
q3
ERR: 0
ERR: 1
ERR: 2
ERR: 3
ERR: 4
q2
OUT: a
ERR: 0
OUT: b
ERR: 1
OUT: c
ERR: 2
OUT: d
ERR: 3
OUT: e
ERR: 4
The asyncio code was lifted from J.F. Sebastian's answer to the question Subprocess.Popen: cloning stdout and stderr both to terminal and variables. Thanks, J.F!
Note that data is written to the files when it becomes available to the scheduled coroutines; exactly when that happens depends on the current system load. So I put the sleep 0.01 command in multitest.bsh to keep the processing of stdout and stderr lines synchronised. Without that delay the stdout and stderr lines in q2 generally won't be nicely interleaved. There may be a better way to achieve that synchronisation, but I'm still very much a novice with asyncio programming.