Asynchronous background processes in Python?


Question


I have been using this as a reference, but am not able to accomplish exactly what I need: http://stackoverflow.com/questions/89228/how-to-call-external-command-in-python/92395#92395

I have also read this: http://www.python.org/dev/peps/pep-3145/


For our project, we have 5 svn checkouts that need to update before we can deploy our application. In my dev environment, where speedy deployments are a bit more important for productivity than a production deployment, I have been working on speeding up the process.


I have a bash script that has been working decently but has some limitations. I fire up multiple 'svn updates' with the following bash command:

(svn update /repo1) & (svn update /repo2) & (svn update /repo3) &


These all run in parallel and it works pretty well. I also use this pattern in the rest of the build script for firing off each ant build, then moving the wars to Tomcat.


However, I have no control over stopping deployment if one of the updates or a build fails.


I'm re-writing my bash script with Python so I have more control over branches and the deployment process.


I am using subprocess.call() to fire off the 'svn update /repo' commands, but each one is acting sequentially. I try '(svn update /repo) &' and those all fire off, but the result code returns immediately. So I have no way to determine if a particular command fails or not in the asynchronous mode.

import subprocess

subprocess.call( 'svn update /repo1', shell=True )
subprocess.call( 'svn update /repo2', shell=True )
subprocess.call( 'svn update /repo3', shell=True )
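# Each call above blocks until its command completes, so the three updates run one after another rather than in parallel.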


I'd love to find a way to have Python fire off each Unix command, and if any of the calls fails at any time the entire script stops.

Answer


Don't use shell=True. It will needlessly invoke the shell to call your svn program, and that will give you the shell's return code instead of svn's.

import subprocess

repos = ['/repo1', '/repo2', '/repo3']
# Launch all three svn updates asynchronously:
procs = [subprocess.Popen(['svn', 'update', repo]) for repo in repos]
# Wait for every process to finish.
for proc in procs:
    proc.wait()
# Check the results:
if any(proc.returncode != 0 for proc in procs):
    print('Something failed')
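
If the goal from the question is for the whole script to stop as soon as the parallel updates are found to have failed, a minimal sketch along the same lines (not part of the original answer; the error message and the use of sys.exit() are illustrative choices) could wait on each process and exit with a non-zero status:

import subprocess
import sys

repos = ['/repo1', '/repo2', '/repo3']

# Start all three svn updates in parallel, without shell=True.
procs = [subprocess.Popen(['svn', 'update', repo]) for repo in repos]

# Wait for each one and remember whether anything failed.
failed = False
for repo, proc in zip(repos, procs):
    proc.wait()
    if proc.returncode != 0:
        print('svn update failed for %s (exit code %d)' % (repo, proc.returncode))
        failed = True

# Stop the deployment script if any update failed.
if failed:
    sys.exit(1)

Exiting with a non-zero status here keeps any later build and Tomcat steps from running, and also lets a calling shell script detect the failure.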

