Growing Python process memory over time


Problem description


My Python process's memory grows dynamically as it stores data in lists, dictionaries and tuples wherever necessary. Even though all of that dynamic data is later cleared from its variables, the memory does not go back down.

Hence I suspected a memory leak, and I used gc.collect() to collect any unfreed memory. But I could not bring the memory back down to a minimum even when the variables hold no data.

Solution

It's very hard, in general, for a process to "give memory back to the OS" (until the process terminates and the OS reclaims all its memory, of course), because, in most implementations, what malloc returns is carved out of big blocks for efficiency, and a whole block can't be given back while any part of it is still in use -- so most C standard libraries don't even try.

For a decent discussion in a Python context, see e.g. here. Evan Jones fixed some Python-specific issues, as described here and here, but his patch has been in the trunk since Python 2.5, so the problems you're observing are definitely with the system malloc package, not with Python per se. A 2.6-specific explanation is here and here.

An SO thread is here, where Hugh Allen, in his answer, quotes Firefox programmers to the effect that Mac OS X is a system on which it's basically impossible for a process to give memory back to the OS.

So, only by terminating a process can you be sure to release its memory. For example, a long-running server could, once in a while, snapshot its state to disk and shut down (with a tiny watchdog process, system-provided or custom, watching over it and restarting it). If you know the next operation will take a lot of memory for a short time, you can often os.fork, do the memory-hungry work in the child process, and have the results (if any) returned to the parent process via a pipe as the child process terminates. And so on, and so forth.

