Python Code Coverage and Multiprocessing


Question


I use coveralls in combination with coverage.py to track the Python code coverage of my testing scripts. I use the following commands:

coverage run --parallel-mode --source=mysource --omit=*/stuff/idont/need.py ./mysource/tests/run_all_tests.py
coverage combine
coveralls --verbose


This works quite nicely, with the exception of multiprocessing: code executed by worker pools or child processes is not tracked.


Is there a way to also track multiprocessing code? Is there a particular option I am missing? Maybe adding wrappers to the multiprocessing library so that coverage is started every time a new process is spawned?

EDIT:


I (and jonrsharpe, also :-) found a monkey-patch for multiprocessing.
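The general shape of such a monkey-patch is to start a fresh coverage collector inside each child process and save its data file on exit, so that `coverage combine` can merge everything later. A minimal sketch of that idea (not the exact code from the patch we found; `CoveredProcess` is an illustrative name, and this only covers direct `multiprocessing.Process(...)` call sites, not `Pool` workers, which are created through the start-method context):

```python
# Sketch of the monkey-patch idea: run a per-child coverage collector.
# Illustrative only; the real patch discussed above differs in detail.
import multiprocessing

try:
    import coverage
except ImportError:  # keep the sketch importable without coverage.py
    coverage = None


class CoveredProcess(multiprocessing.Process):
    """multiprocessing.Process that records coverage inside the child."""

    def _bootstrap(self, *args, **kwargs):
        if coverage is None:
            return super()._bootstrap(*args, **kwargs)
        # data_suffix=True gives each process a unique .coverage.* file
        cov = coverage.Coverage(data_suffix=True)
        cov.start()
        try:
            return super()._bootstrap(*args, **kwargs)
        finally:
            cov.stop()
            cov.save()


# Replace the stock class so existing Process() call sites pick up the patch.
multiprocessing.Process = CoveredProcess
```

After the children exit, their per-process data files are merged with `coverage combine` as usual.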


However, this does not work for me: my Travis-CI build is killed almost immediately after starting. I checked the problem on my local machine, and apparently adding the patch to multiprocessing blows up my memory usage. Tests that normally need much less than 1 GB of memory require more than 16 GB with this fix.

EDIT 2:


The monkey-patch does work after a small modification: removing the config-file parsing (config_file=os.environ['COVERAGE_PROCESS_START']) did the trick and solved the memory bloat. The corresponding line simply becomes:

cov = coverage(data_suffix=True)
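For context, COVERAGE_PROCESS_START is part of coverage.py's own documented mechanism for measuring subprocesses: you point that environment variable at a config file and call coverage.process_startup() from a sitecustomize.py (or .pth file) on the Python path. A minimal sketch of that hook, guarded so it is harmless where coverage.py is absent:

```python
# sitecustomize.py -- documented coverage.py hook for subprocess measurement.
# coverage.process_startup() only starts collection when the environment
# variable COVERAGE_PROCESS_START points at a coverage config file, so
# importing this module is a no-op otherwise.
try:
    import coverage
    coverage.process_startup()
except ImportError:
    pass  # coverage.py not installed; do nothing
```

The variable would be set before running the tests, e.g. `export COVERAGE_PROCESS_START="$PWD/.coveragerc"`.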

Answer


Coverage 4.0 includes a command-line option --concurrency=multiprocessing to deal with this. You must use coverage combine afterward. For instance, if your tests are in regression_tests.py, then you would simply do this at the command line:

coverage run --concurrency=multiprocessing regression_tests.py
coverage combine
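The same setting can also live in a .coveragerc file so the command line stays short. A sketch, assuming the source layout from the question (`concurrency` and `parallel` are [run]-section options in coverage.py's configuration; `parallel = True` makes each process write its own uniquely named data file for `coverage combine`):

```ini
[run]
concurrency = multiprocessing
parallel = True
source = mysource
```

With this file in place, `coverage run regression_tests.py` followed by `coverage combine` behaves the same as the command line above.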
