Force garbage collection in Python to free memory


Question



I have a Python 2.7 app that uses lots of dict objects, mostly containing strings for keys and values.

Sometimes those dicts and strings are not needed anymore and I would like to remove those from memory.

I tried different things (del dict[key], del dict, etc.), but the app still uses the same amount of memory.

Below is an example that I would expect to free the memory, but it doesn't:

import gc
import resource

def mem():
    print('Memory usage         : % 2.2f MB' % round(
        resource.getrusage(resource.RUSAGE_SELF).ru_maxrss/1024.0/1024.0,1)
    )

mem()

print('...creating list of dicts...')
n = 10000
l = []
for i in xrange(n):
    a = 1000*'a'
    b = 1000*'b'
    l.append({ 'a' : a, 'b' : b })

mem()

print('...deleting list items...')

for i in xrange(n):
    l.pop(0)

mem()

print('GC collected objects : %d' % gc.collect())

mem()

Output:

Memory usage         :  4.30 MB
...creating list of dicts...
Memory usage         :  36.70 MB
...deleting list items...
Memory usage         :  36.70 MB
GC collected objects : 0
Memory usage         :  36.70 MB

I would expect here some objects to be 'collected' and some memory to be freed.

Am I doing something wrong? Are there any other ways to delete unused objects, or at least to find where the objects are unexpectedly being kept alive?

Solution

Fredrik Lundh explains:

If you create a large object and delete it again, Python has probably released the memory, but the memory allocators involved don’t necessarily return the memory to the operating system, so it may look as if the Python process uses a lot more virtual memory than it actually uses.

and Alex Martelli writes:

The only really reliable way to ensure that a large but temporary use of memory DOES return all resources to the system when it's done, is to have that use happen in a subprocess, which does the memory-hungry work then terminates.

So, you could use multiprocessing to spawn a subprocess, perform the memory-hogging calculation, and then ensure the memory is released when the subprocess terminates:

import multiprocessing as mp
import resource

def mem():
    print('Memory usage         : % 2.2f MB' % round(
        resource.getrusage(resource.RUSAGE_SELF).ru_maxrss/1024.0,1)
    )

mem()

def memoryhog():
    print('...creating list of dicts...')
    n = 10**5
    l = []
    for i in xrange(n):
        a = 1000*'a'
        b = 1000*'b'
        l.append({ 'a' : a, 'b' : b })
    mem()

proc = mp.Process(target=memoryhog)
proc.start()
proc.join()

mem()

This yields:

Memory usage         :  5.80 MB
...creating list of dicts...
Memory usage         :  234.20 MB
Memory usage         :  5.90 MB
