A question about subprocess


Problem Description





Hi,

I want to send my jobs to a whole bunch of machines (using ssh). The
jobs need to be run in the following pattern:

(Machine A)    (Machine B)    (Machine C)
Job A1         Job B1         Job C1
Job A2         Job B2         etc.
Job A3         etc.
etc.

Jobs running on machines A, B, and C should run in parallel; however, on
each machine the jobs should run one after another.

How can I do it with subprocess?
Thanks,

JD

Recommended Answers

JD <Ji*********@gmail.com> wrote:

How can I do it with the subprocess?




You can't. Subprocess is a library to spawn new processes on the local
machine. If you want to handle external machines you need something like
parallel python: <http://www.parallelpython.com/>

--
Lawrence, oluyede.org - neropercaso.it
"It is difficult to get a man to understand
something when his salary depends on not
understanding it" - Upton Sinclair


On Oct 3, 9:46 am, JD <Jiandong...@gmail.com> wrote:
Hi,

I want to send my jobs to a whole bunch of machines (using ssh). The
jobs need to be run in the following pattern:

(Machine A)    (Machine B)    (Machine C)
Job A1         Job B1         Job C1
Job A2         Job B2         etc.
Job A3         etc.
etc.

Jobs running on machines A, B, and C should run in parallel; however, on
each machine the jobs should run one after another.

How can I do it with subprocess?




subprocess is not network aware. What you can do is write a simple
python script say run_jobs.py which can take in a command-line
argument (say A or B or C) and will fire a sequence of subprocesses to
execute a series of jobs. This will ensure the serialization condition
like A2 starting after A1's completion.
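A minimal sketch of such a run_jobs.py (the job command names here are hypothetical placeholders; the real jobs would go in their place):

```python
#!/usr/bin/env python
# run_jobs.py -- a sketch of the per-machine serializer described above.
import subprocess
import sys

def run_sequence(commands):
    """Run each command in order. subprocess.call blocks until the
    child exits, which is what serializes the jobs on this machine.
    Returns the first nonzero exit code, or 0 if all jobs succeed."""
    for cmd in commands:
        rc = subprocess.call(cmd)
        if rc != 0:
            return rc
    return 0

# Ordered job list per machine label -- placeholder command names.
JOBS = {
    "A": [["job_a1"], ["job_a2"], ["job_a3"]],
    "B": [["job_b1"], ["job_b2"]],
    "C": [["job_c1"]],
}

if __name__ == "__main__":
    # e.g. "run_jobs.py A" runs A1, then A2, then A3.
    sys.exit(run_sequence(JOBS[sys.argv[1]]))
```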

Now you can write a load-distributor kind of script which uses ssh to
login to the various machines and run run_jobs.py with appropriate
argument (Here I assume all machines have access to run_jobs.py -- say
it may reside on a shared mounted file-system).

e.g. in outer script:

ssh machine-A run_jobs.py A
ssh machine-B run_jobs.py B
ssh machine-C run_jobs.py C
....

You may want to fire all these at once so that they all execute in
parallel.
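The outer script can itself be plain Python with subprocess: start every ssh with Popen (which does not block) and only then wait, so the machines run in parallel. A sketch, assuming the hypothetical host names above and run_jobs.py on a shared file-system:

```python
# Outer load-distributor sketch: fire all ssh commands at once,
# then wait for each; each ssh blocks until its run_jobs.py finishes.
import subprocess

def fan_out(commands):
    """Start every command in parallel with Popen, then wait for all.
    Returns the exit codes in the same order as the commands."""
    procs = [subprocess.Popen(cmd) for cmd in commands]
    return [p.wait() for p in procs]

if __name__ == "__main__":
    # Hypothetical host names, as in the example above.
    fan_out([
        ["ssh", "machine-A", "run_jobs.py", "A"],
        ["ssh", "machine-B", "run_jobs.py", "B"],
        ["ssh", "machine-C", "run_jobs.py", "C"],
    ])
```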

Karthik





In message <11*********************@22g2000hsm.googlegroups.com>, JD wrote:

I want to send my jobs to a whole bunch of machines (using ssh). The
jobs need to be run in the following pattern:

(Machine A)    (Machine B)    (Machine C)
Job A1         Job B1         Job C1
Job A2         Job B2         etc.
Job A3         etc.
etc.

Jobs running on machines A, B, and C should run in parallel; however, on
each machine the jobs should run one after another.

How can I do it with subprocess?




You could do it with SSH. A command like

ssh machine_a run_job_a1.py

will not terminate until the execution of run_job_a1.py on the remote
machine has terminated. So you end up with a lot of "proxy" subprocesses,
if you like, on the master machine, each one waiting for a remote process
on some slave machine to terminate. As the controlling process notices the
termination of each proxy process, it looks to see which slave machine that
maps to, and sends another command to start the next job on that machine.
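A sketch of such a controlling process, using subprocess.Popen for the proxies and poll() to notice termination. The command lists here are placeholders; in real use each would be an ssh invocation like the one above:

```python
# Controller sketch: one live "proxy" subprocess per machine; when a
# proxy terminates, launch that machine's next job.
import subprocess
import time

def drive(queues):
    """queues maps a machine name to its ordered list of command argv
    lists (in real use, each would be ["ssh", host, job_script]).
    Returns the exit codes collected per machine, in job order."""
    live = {}
    codes = dict((host, []) for host in queues)
    for host in queues:
        if queues[host]:
            live[host] = subprocess.Popen(queues[host].pop(0))
    while live:
        for host, proc in list(live.items()):
            rc = proc.poll()
            if rc is None:
                continue                      # proxy still running
            codes[host].append(rc)
            if queues[host]:                  # start this host's next job
                live[host] = subprocess.Popen(queues[host].pop(0))
            else:
                del live[host]
        time.sleep(0.05)  # polling kept simple for the sketch
    return codes
```

A real controller would likely also handle failed jobs (stop a machine's queue on a nonzero exit) rather than blindly moving on.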

