
I have realised that when I create a large array in a Jupyter Notebook, e.g.

M = np.zeros([10000, 2000, 600], dtype=np.int64)

I get the message

MemoryError: Unable to allocate 89.4 GiB for an array with shape (10000, 2000, 600) and data type int64

although my workstation has 128 GB of RAM. Why do I get this message, and are there any restrictions on RAM due to settings in Jupyter?

The other question is: when I create two (or even ten!) arrays, each roughly half the size of M, it works:

A = np.zeros([10000, 1000, 600], dtype=np.int64)
B = np.zeros([10000, 1000, 600], dtype=np.int64)
C = np.zeros([10000, 1000, 600], dtype=np.int64)
D = np.zeros([10000, 1000, 600], dtype=np.int64)

although every single array has a size of 48 GB (according to A.nbytes), so the four together total 192 GB, which is obviously more than my available RAM.

How can this be explained?
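For context, the sizes in question can be verified without allocating anything, just by multiplying the shape by the itemsize. This is a minimal sketch of that arithmetic (the shapes and dtype are taken from the question above; note that the error message reports GiB, i.e. powers of 1024, while nbytes gives raw bytes):

```python
import numpy as np

def array_size(shape, dtype=np.int64):
    """Compute the memory an array of this shape/dtype would need,
    without actually allocating it."""
    n_bytes = np.prod(shape, dtype=np.int64) * np.dtype(dtype).itemsize
    return n_bytes, n_bytes / 2**30  # raw bytes, and GiB as in the error message

# The failing array M: matches the 89.4 GiB from the MemoryError.
m_bytes, m_gib = array_size((10000, 2000, 600))
print(f"M: {m_bytes} bytes = {m_gib:.1f} GiB")

# Each of A, B, C, D: 48 GB (decimal) = ~44.7 GiB each,
# so four of them nominally total ~178.8 GiB > 128 GB of RAM.
a_bytes, a_gib = array_size((10000, 1000, 600))
print(f"A: {a_bytes} bytes = {a_gib:.1f} GiB")
```

This confirms the numbers are self-consistent; the question of why the four smaller allocations nevertheless succeed comes down to how the OS commits memory (see the overcommit link in the comment), since np.zeros can return pages that are not physically backed until they are first written.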

  • Which OS? If Linux, see https://serverfault.com/questions/606185/how-does-vm-overcommit-memory-work – AKX Nov 20 '22 at 21:09
