How to force Python dictionary to shrink?
Problem Description
I have experienced this in other languages, and now I have the same problem in Python. I have a dictionary with a lot of CRUD activity. One would assume that deleting elements from a dictionary should decrease its memory footprint, but that's not the case. Once a dictionary grows in size (usually by doubling), it never seems to release the allocated memory back. I have run this experiment:
import random
import sys
import uuid

a = {}
for i in range(0, 100000):
    a[uuid.uuid4()] = uuid.uuid4()
    if i % 1000 == 0:
        print(sys.getsizeof(a))

for i in range(0, 100000):
    e = random.choice(list(a.keys()))
    del a[e]
    if i % 1000 == 0:
        print(sys.getsizeof(a))

print(len(a))
The last line of the first loop prints 6291736. The last line of the second loop prints 6291736 as well, and the size of the dictionary is 0.
So how do I tackle this issue? Is there a way to force Python to release the memory?
PS: the random choice isn't really needed - I also played with the range of the second loop.
Recommended Answer
The way to do this "rehashing" so it uses less memory is to create a new dictionary and copy the content over.
The Python dictionary implementation is explained really well in this video:
There is an attendee asking this same question (https://youtu.be/C4Kc8xzcA68?t=1593), and the answer given by the speaker is:
调整大小仅在插入时计算;随着字典的缩小,它只会获得大量的虚拟条目,而当您重新填充它时,它将开始重新使用那些来存储密钥. [...]您必须将键和值复制到新词典中
Resizes are only calculated upon insertion; as a dictionary shrinks it just gains a lot of dummy entries and as you refill it will just start reusing those to store keys. [...] you have to copy the keys and values out to a new dictionary
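The speaker's suggestion can be sketched as follows (a minimal example in Python 3; the exact byte counts printed depend on the interpreter version, so only the before/after relationship matters): emptying the dictionary with `del` leaves `sys.getsizeof` unchanged, while copying the live entries into a fresh dictionary produces a table sized for the current contents.

```python
import sys
import uuid

a = {}
for _ in range(100000):
    a[uuid.uuid4()] = uuid.uuid4()

# Delete every entry one by one; the internal hash table
# keeps its allocated slots (now filled with dummy entries).
for k in list(a.keys()):
    del a[k]

print(len(a), sys.getsizeof(a))  # 0 entries, but still several MB

# "Rehash" by copying into a new dictionary: the copy is built
# by insertion, so it is sized for the current (empty) contents
# and the old oversized table can be garbage-collected.
a = {k: v for k, v in a.items()}
print(len(a), sys.getsizeof(a))  # 0 entries, back to empty-dict size
```

Note that in CPython, `a.clear()` also releases the internal table, but that only helps when you are discarding everything; the copy approach works when you want to keep the remaining entries.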