Python process consuming increasing amounts of system memory, but heapy shows roughly constant usage


Problem description

I'm trying to identify a memory leak in a Python program I'm working on. I'm currently running Python 2.7.4 on Mac OS, 64-bit. I installed heapy to hunt down the problem.

The program involves creating, storing, and reading a large database using the shelve module. I am not using the writeback option, which I know can create memory problems.
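For reference, a minimal sketch of the kind of shelve usage described, with writeback left at its default of False (the filename and keys here are hypothetical):

```python
import shelve

# Hypothetical illustration of the pattern described in the question:
# a shelf opened without writeback, written to, and read back.
db = shelve.open('example_shelf')        # writeback defaults to False
try:
    db['record-1'] = list(range(1000))   # store a pickled value under a key
    data = db['record-1']                # read it back; a fresh copy each time
finally:
    db.close()
```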

Heapy shows that memory usage is roughly constant during program execution. Yet Activity Monitor shows rapidly increasing memory. Within 15 minutes, the process has consumed all of my system memory (16 GB), and I start seeing page-outs. Any idea why heapy isn't tracking this properly?
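For context, heapy measurements are usually taken as snapshots like the sketch below; this assumes the guppy package that provides heapy (guppy3 on Python 3) and is not the asker's actual instrumentation:

```python
from guppy import hpy   # heapy is part of the guppy package

hp = hpy()
hp.setrelheap()          # measure growth relative to this point
# ... run a chunk of the workload here ...
print(hp.heap())         # summarize the Python objects heapy can see
```

Note that heapy only reports objects it can see on the Python heap, so memory held by the allocator itself (for example in fragmented arenas) or by C-level allocations will not appear in these snapshots.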

Answer

Take a look at this fine article. You are most likely not seeing memory leaks but memory fragmentation. The best workaround I have found is to identify what the output of your large working-set operation actually is, load the large dataset in a new process, calculate the output, and then return that output to the original process.
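A sketch of that workaround using the standard multiprocessing module, under the assumption that the heavy load/reduce steps can be wrapped in a worker function (load_big_dataset, summarize, and the shard names below are hypothetical stand-ins):

```python
import multiprocessing

def load_big_dataset(shard_path):
    # Stand-in for the real, memory-hungry load (e.g. reading from a shelve file).
    return [i * i for i in range(1000000)]

def summarize(big_data):
    # Stand-in for the real reduction; only this small value is sent back
    # to the parent process.
    return sum(big_data)

def process_shard(shard_path):
    big_data = load_big_dataset(shard_path)
    return summarize(big_data)

if __name__ == '__main__':
    # maxtasksperchild=1 gives each task a fresh worker, so all memory held
    # by a large working set is returned to the OS when that worker exits.
    pool = multiprocessing.Pool(processes=1, maxtasksperchild=1)
    results = pool.map(process_shard, ['shard-0', 'shard-1'])
    pool.close()
    pool.join()
    print(results)
```

Because the fragmentation lives in the worker's address space, it disappears when the worker process exits, while the parent only ever holds the small results.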

This answer has some great insight and an example as well. I don't see anything in your question that seems like it would preclude the use of PyPy.
