This code produces a MemoryError:

    from pylab import complex128
    import numpy

    x = numpy.empty(100000000, dtype=complex128)  # 100 million complex128
I have Windows 7 64-bit with 8 GB of RAM (at least 5.3 GB free when running this code). I'm using Python 2.7 (Anaconda), and I think it is the 32-bit version. Even with 32 bits, we should be able to handle 1.6 GB!
Do you know how to solve this?
PS: I expected an array of 100 million items, each using 16 bytes (128 bits), to take 16 * 100 million = 1.6 GB. This is confirmed by:

    x = numpy.empty(1000000, dtype=complex128)  # 1 million here
    print x.nbytes
    >>> 16000000  # 16 MB
The problem was solved by switching to 64-bit Python. It's even possible to create a single array of more than 5 GB.
Note: when I create an array that should use 1,600,000,000 bytes (100 million items in a complex128 array), the actual memory usage is not much more: 1,607,068 KB…