Control the number of subprocesses used to call external commands in python
Problem description
I understand using subprocess is the preferred way of calling external commands.
But what if I want to run several commands in parallel, while limiting the number of processes being spawned? What bothers me is that I can't block on the subprocesses. For example, if I call
subprocess.Popen(cmd, stderr=outputfile, stdout=outputfile)
then the process will continue without waiting for cmd to finish. Therefore, I can't wrap it up in a worker of the multiprocessing library.
For example, if I do:
def worker(cmd):
    subprocess.Popen(cmd, stderr=outputfile, stdout=outputfile)

pool = Pool(processes=10)
results = [pool.apply_async(worker, [cmd]) for cmd in cmd_list]
ans = [res.get() for res in results]
then each worker will finish and return immediately after spawning a subprocess, so I can't really limit the number of processes generated by subprocess using Pool.
What's the proper way to limit the number of subprocesses?
Recommended answer
You can use subprocess.call if you want to wait for the command to complete. See pydoc subprocess for more information.
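For instance, a minimal sketch showing that subprocess.call blocks until the child exits and returns its exit code (the child command here is an illustrative one-liner, not a command from the original question):

```python
import subprocess
import sys

# subprocess.call blocks until the command finishes, then returns its
# exit code, so nothing after this line runs while the child is alive.
ret = subprocess.call([sys.executable, "-c", "print('hello from child')"])
print("child exited with code", ret)  # 0 on success
```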
You could also call the Popen.wait method in your worker:
def worker(cmd):
    p = subprocess.Popen(cmd, stderr=outputfile, stdout=outputfile)
    p.wait()
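Putting the two pieces together, here is a self-contained sketch of the whole pattern: because each worker blocks on wait(), the Pool caps how many external commands run concurrently. The command list is an illustrative assumption (each child just sleeps briefly), and DEVNULL stands in for the question's outputfile.

```python
import subprocess
import sys
from multiprocessing import Pool

def worker(cmd):
    # Popen starts the command; wait() blocks this worker until the
    # command exits, so the Pool never runs more than `processes`
    # commands at once.  DEVNULL stands in for the asker's outputfile.
    p = subprocess.Popen(cmd, stdout=subprocess.DEVNULL,
                         stderr=subprocess.DEVNULL)
    return p.wait()

if __name__ == "__main__":
    # Illustrative command list: each child sleeps briefly, then exits.
    cmd_list = [[sys.executable, "-c", "import time; time.sleep(0.1)"]
                for _ in range(8)]
    # At most 4 of the 8 commands run at any moment.
    with Pool(processes=4) as pool:
        results = [pool.apply_async(worker, [cmd]) for cmd in cmd_list]
        exit_codes = [res.get() for res in results]
    print(exit_codes)  # one exit code per command
```

Because wait() returns the child's exit code, the collected results also tell you which commands failed.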