Python deep getsizeof list with contents?
Question
I was surprised that sys.getsizeof( 10000*[x] ) is 40036 regardless of x: 0, "a", 1000*"a", {}.
Is there a deep_getsizeof which properly considers elements that share memory?
(The question came from looking at in-memory database tables like range(1000000) -> province names: list or dict?)
(Python is 2.6.4 on a Mac PPC.)
Added: 10000*["Mississippi"] is 10000 pointers to one "Mississippi", as several people have pointed out. Try this:
nstates = [AlabamatoWyoming() for j in xrange(N)]
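A quick check confirms the distinction above: 10000*[x] is 10000 pointers to one object, while strings built at run time are separate objects per call. (AlabamatoWyoming itself isn't shown, so the run-time-built strings below are only a stand-in for it.)

```python
shared = 10000 * ["Mississippi"]
# Every slot of the multiplied list is the very same string object:
print(all(s is shared[0] for s in shared))    # True

# Strings constructed at run time (a stand-in for the hypothetical
# AlabamatoWyoming()) are distinct objects per call in CPython:
distinct = ["".join(["Missi", "ssippi"]) for j in range(10000)]
print(distinct[0] is distinct[1])             # False
```

So a shallow getsizeof of either list reports the same number, even though the second one holds 10000 separate string objects.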
where AlabamatoWyoming() -> a string "Alabama" .. "Wyoming".
What's deep_getsizeof(nstates)?
(How can we tell?
- a proper deep_getsizeof: difficult, ~ a gc tracer
- estimate from the total VM size
- inside knowledge of the Python implementation
- guess.)
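The first option above, a recursive deep_getsizeof, can be sketched with the standard library alone. This is an illustration, not a production profiler: using gc.get_referents to define "contents" is an assumption, and it can over-traverse (e.g. into a function's globals) if you point it at arbitrary objects.

```python
import gc
import sys

def deep_getsizeof(obj):
    """Sum sys.getsizeof over the object graph reachable from obj,
    counting each distinct object (by id) only once, so elements
    that share memory are not double-counted."""
    seen = set()
    total = 0
    stack = [obj]
    while stack:
        o = stack.pop()
        if id(o) in seen:
            continue
        seen.add(id(o))
        total += sys.getsizeof(o)
        # gc.get_referents returns the objects o directly refers to
        # (list items, dict keys/values, instance attributes, ...).
        stack.extend(gc.get_referents(o))
    return total

shared = 10000 * ["Mississippi"]                   # one string, shared
distinct = [str(i) + "-Miss" for i in range(10000)]  # 10000 strings
print(deep_getsizeof(shared) < deep_getsizeof(distinct))   # True
```

Unlike the flat sys.getsizeof, this reports a much larger size for the list of 10000 distinct strings than for the list of 10000 pointers to one string.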
Added 25jan: see also when-does-python-allocate-new-memory-for-identical-strings
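For reference, a minimal demonstration of the point in that linked question, written with Python 3's sys.intern (the question itself predates that spelling; this is illustrative only):

```python
import sys

# Equal strings built at run time are distinct objects in CPython:
a = "".join(["Missi", "ssippi"])
b = "".join(["Missi", "ssippi"])
print(a is b)        # False

# Interning maps equal values onto one shared object:
ai = sys.intern(a)
bi = sys.intern(b)
print(ai is bi)      # True
```

This is exactly the sharing that makes a shallow getsizeof misleading: whether equal strings occupy memory once or many times depends on how they were created.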
Answer
Have a look at guppy/heapy; I haven't played around with it too much myself, but a few of my co-workers have used it for memory profiling with good results.
The documentation could be better, but this howto does a decent job of explaining the basic concepts.