Robert LaMarca <ro************@yahoo.com> wrote:
> Hi,
> I am using numpy and wish to create very large arrays. My system is
> AMD 64 x 2, Ubuntu 8.04. Ubuntu should be 64-bit. I have 3 GB RAM and a
> 15 GB swap drive.
> The command I have been trying to use is:
> g=numpy.ones([1000,1000,1000],numpy.int32)
> This returns a memory error.
Works for me on AMD64x2, 2 GB RAM, 3 GB swap, with much paging of course
;) The python process size was 4019508 kB.
> A smaller array ([500,500,500]) worked fine.
About 0.5 GB, so no surprise.
> Two smaller arrays again crashed the system.
Crash as in "computer reboots"? Strange. And you are saying that two
[500, 500, 500] arrays are too much?
> So... I did the math. A 1000x1000x1000 array at 32 bits should be
> around 4 GB RAM... Obviously larger than RAM, but much smaller than the
> swap drive.
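The arithmetic does check out; a quick sanity check in Python (plain
arithmetic, nothing numpy-specific):

```python
# 1000**3 elements of 4 bytes (int32) each:
nbytes = 1000 * 1000 * 1000 * 4
print(nbytes)                       # 4000000000 bytes
print(round(nbytes / 2.0 ** 30, 2)) # 3.73 (GiB)
```

That is already more than a 32-bit process can address, no matter how much swap is available.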
Sounds like either your kernel or your python is not 64-bit.
For the kernel, have a look at /proc/meminfo and check the value of
VmallocTotal. I have VmallocTotal: 34359738367 kB.
For python, check sys.maxint. If it is 2147483647, then you have a 32-bit
python. Mine says 9223372036854775807.
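Another way to check the interpreter's bitness from the standard library
(struct.calcsize("P") is the size of a C pointer in bytes, so this works
whether or not sys.maxint is around):

```python
import struct
import sys

# Pointer size in bits: 64 on a 64-bit interpreter, 32 on a 32-bit one.
bits = struct.calcsize("P") * 8
print(bits)

# sys.maxsize is 2**63 - 1 on a 64-bit build, 2**31 - 1 on a 32-bit one.
print(sys.maxsize)
```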
> 1. So... does Numpy really have a lot of overhead? Or is my system
> just not somehow getting to make use of the 15 GB swap area?
No. Yes: in a 32-bit process the 4 GB allocation fails outright, so the
swap never even comes into play.
> 2. Is there a way I can access the swap area, or direct numpy to do
> so? Or do I have to write out my own numpy cache system...
> 3. How difficult is it to use data compression internally on numpy
> arrays?
2 + 3: Should not be necessary.
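(If you ever do want an explicitly disk-backed array, numpy.memmap is the
usual tool; the data lives in a file and the OS pages pieces in and out on
demand. A sketch, with a hypothetical filename and a deliberately small
shape -- the same call scales to shapes far larger than RAM on a 64-bit
system:)

```python
import numpy

# "scratch.dat" is just an example filename; mode="w+" creates the file.
a = numpy.memmap("scratch.dat", dtype=numpy.int32, mode="w+",
                 shape=(100, 100, 50))
a[0, 0, :10] = numpy.arange(10)
a.flush()  # write pending changes back to the file
print(int(a[0, 0, 5]))  # 5
```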
I just tried a 32-bit python on my 64-bit system, using Numeric instead of
numpy (I don't have a 32-bit numpy ready):
g=Numeric.zeros([1000,1000,1000],Numeric.Int32)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
MemoryError: can't allocate memory for array
g=Numeric.zeros([1000,1000,500],Numeric.Int32) succeeds.
So it looks like you have a 32-bit kernel.
Marc