I'm using 32-bit Python on 32-bit Windows XP.
Sometimes my program fails on this line:
fp = np.memmap('C:/memmap_test', dtype='float32', mode='w+', shape=(rows,cols))
The error is raised from memmap.py:
Traceback (most recent call last):
  fp = np.memmap('C:/memmap_test', dtype='float32', mode='w+', shape=(rows,cols))
  File "C:\Python27\lib\site-packages\numpy\core\memmap.py", line 253, in __new__
    mm = mmap.mmap(fid.fileno(), bytes, access=acc, offset=start)
OverflowError: cannot fit 'long' into an index-sized integer
So I assume there is a limit on the array size. What is the maximum array size maxN = rows*cols?
The same question for: 1. 32-bit Python on 64-bit Windows, and 2. 64-bit Python on 64-bit Windows.
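If the limit is the process virtual address space (my assumption, not something I've verified; a 32-bit Windows process gets roughly 2 GB of user address space by default), a back-of-the-envelope bound would be:

max_bytes = 2 * 1024 ** 3     # ~2 GB user address space on 32-bit Windows (assumption)
itemsize = 4                  # float32
maxN = max_bytes // itemsize  # = 536870912 elements, before any other allocations

In practice the usable maximum would be lower, since the interpreter, DLLs, and the heap all share the same address space.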
UPDATE:
import numpy as np

# create one big array
rows = 250000
cols = 1000
fA = np.memmap('A.npy', dtype='float32', mode='w+', shape=(rows, cols))
# fA1 = np.memmap('A1.npy', dtype='float32', mode='w+', shape=(rows, cols))  # fails: can't create a second memmap this big
print fA.nbytes / 1024 / 1024  # 953 MB
So it seems there is another limitation, not just the <2 GB limit on a single memmapped array.
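A quick way to check whether the limit is on total mapped address space rather than on each array individually (a sketch; the chunk size, loop bound, and file names are my own choices):

import numpy as np

n = 64 * 1024 * 1024  # 64M float32 elements = 256 MB per map (arbitrary)
maps = []
try:
    for i in range(16):  # 16 x 256 MB = 4 GB if it all fit
        maps.append(np.memmap('chunk_%d.dat' % i, dtype='float32',
                              mode='w+', shape=(n,)))
        print 'mapped %d maps, %d MB total' % (len(maps), 256 * len(maps))
except Exception as e:
    print repr(e)  # on a 32-bit process I'd expect this well before 4 GB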
Here is also the output of the test provided by @Paul:
working with 30000000 elements
number bytes required 0.240000 GB
works
working with 300000000 elements
number bytes required 2.400000 GB
OverflowError("cannot fit 'long' into an index-sized integer",)
working with 3000000000 elements
number bytes required 24.000000 GB
IOError(28, 'No space left on device')
working with 30000000000 elements
number bytes required 240.000000 GB
IOError(28, 'No space left on device')
working with 300000000000 elements
number bytes required 2400.000000 GB
IOError(28, 'No space left on device')
working with 3000000000000 elements
number bytes required 24000.000000 GB
IOError(22, 'Invalid argument')
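For reference, a sketch of the kind of probe that would produce output like the above (@Paul's actual script isn't shown in the post; the float64 dtype and loop bounds here are inferred from the printed sizes):

import numpy as np

for i in range(7, 13):
    n = 3 * 10 ** i
    print 'working with %d elements' % n
    print 'number bytes required %f GB' % (n * 8 / 1e9)
    try:
        # try to map n float64 values into a single file
        fp = np.memmap('test.mm', dtype='float64', mode='w+', shape=(n,))
        del fp
        print 'works'
    except Exception as e:
        print repr(e)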