Cannot allocate 1.6 GB in Python
Problem description
This code produces a MemoryError:
from pylab import complex128  # pylab re-exports numpy.complex128
import numpy
x = numpy.empty(100000000, dtype=complex128)  # 100 million complex128 values
I have 64-bit Windows 7 with 8 GB of RAM (at least 5.3 GB free when running this code). I'm using Python 2.7 (Anaconda), and I think it is the 32-bit version. Even with 32 bits, we should be able to handle 1.6 GB!
Do you know how to solve this?
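As an aside, a quick way to confirm whether the running interpreter is a 32-bit or 64-bit build (an illustrative snippet, not part of the original question):

import struct
import sys

# Pointer size: 4 bytes on a 32-bit build, 8 bytes on a 64-bit build
print struct.calcsize("P") * 8  # 32 or 64
print sys.maxsize > 2**32       # True only on a 64-bit build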
PS: I expected an array of 100 million items, each of them using 16 bytes (128 bits), to use 16 * 100 million = 1.6 GB. This is confirmed by:
x = numpy.empty(1000000, dtype=complex128)  # 1 million items
print x.nbytes
# output: 16000000 (16 MB)
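The same figure can be derived without allocating anything, from the dtype's itemsize (an illustrative check, not part of the original post):

import numpy

# complex128 stores two 64-bit floats, i.e. 16 bytes per element
itemsize = numpy.dtype(numpy.complex128).itemsize
print itemsize              # 16
print itemsize * 100000000  # 1600000000 bytes = 1.6 GB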
Answer

The problem was solved by switching to 64-bit Python. (By default, a 32-bit process on Windows gets only 2 GB of user address space, and NumPy needs a single contiguous block of virtual address space for the array, so a 1.6 GB allocation can easily fail there.)
It's even possible to create a single array of more than 5 GB.
Note: when I create an array which should use 1 600 000 000 bytes (100 million items in a complex128 array), the actual memory usage is not "much" more: 1 607 068 KB...
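For completeness, a minimal reproduction on a 64-bit interpreter (a sketch assuming enough free RAM; the printed value follows from nbytes = items * itemsize):

import numpy

# A 64-bit process has vastly more virtual address space, so reserving
# a single contiguous 1.6 GB block is no longer a problem.
x = numpy.empty(100000000, dtype=numpy.complex128)
print x.nbytes  # 1600000000, i.e. 1.6 GB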