
Basically, I am getting a MemoryError in Python when trying to perform an algebraic operation on a NumPy matrix. The variable u is a large matrix of doubles (in the failing case, a 288x288x156 matrix). I only get this error in this huge case; I am able to do the same operation on other large matrices, just not ones this big. Here is the Python error:

 Traceback (most recent call last):
   File "S:\3D_Simulation_Data\Patient SPM Segmentation\20 pct perim erosion flattop\SwSim.py", line 121, in __init__
     self.mainSimLoop()
   File "S:\3D_Simulation_Data\Patient SPM Segmentation\20 pct perim erosion flattop\SwSim.py", line 309, in mainSimLoop
     u = solver.solve_cg(u,b,tensors,param,fdHold,resid) # Solve the left hand side of the equation Au=b with conjugate gradient method to approximate u
   File "S:\3D_Simulation_Data\Patient SPM Segmentation\20 pct perim erosion flattop\conjugate_getb.py", line 47, in solve_cg
     u = u + alpha*p
 MemoryError

u = u + alpha*p is the line of code that fails.

alpha is just a double, while u and p are the large matrices described above (both of the same size).

I don't know much about memory errors, especially in Python. Any insight/tips into solving this would be very appreciated!

Thanks

tylerthemiler

3 Answers


Rewrite to

p *= alpha
u += p

and this will use much less memory. p = p*alpha allocates a whole new matrix for the result of p*alpha and then discards the old p; p *= alpha does the same computation in place.

In general, with big matrices, try to use op= assignment.
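A minimal sketch of the difference, using small stand-in arrays in place of the 288x288x156 one from the question:

```python
import numpy as np

# Small stand-in arrays (the question's case is 288x288x156).
u = np.ones((4, 4, 4))
p = np.full((4, 4, 4), 2.0)
alpha = 0.5

# u = u + alpha*p would allocate two temporaries -- one for alpha*p and
# one for the sum -- before rebinding u. In-place updates avoid both:
p *= alpha   # p now holds alpha*p; no new array is allocated
u += p       # u is updated in place; no new array is allocated
```

Note that this overwrites p; if the surrounding conjugate-gradient loop still needs the original p afterwards, `u += alpha * p` keeps p intact and still avoids one of the two temporaries that `u = u + alpha*p` creates.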

luispedro

Another tip I have found to avoid memory errors is to manually control garbage collection. When objects are deleted or go out of scope, the memory used for these variables isn't freed up until a garbage collection is performed. I have found with some of my code using large numpy arrays that I get a MemoryError, but that I can avoid this if I insert calls to gc.collect() at appropriate places.

You should only look into this option if using "op=" style operators etc doesn't solve your problem as it's probably not the best coding practice to have gc.collect() calls everywhere.
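A sketch of where such a call might go, using a made-up loop with hypothetical sizes (not the asker's solver):

```python
import gc
import numpy as np

def noisy_sum(n_steps, shape=(32, 32, 32)):
    """Hypothetical loop that allocates a large temporary on every step."""
    total = 0.0
    for _ in range(n_steps):
        work = np.random.rand(*shape)  # large temporary array
        total += float(work.sum())
        del work       # drop the last reference...
        gc.collect()   # ...and force collection before the next big allocation
    return total
```

In CPython most NumPy arrays are actually freed as soon as their reference count hits zero, so gc.collect() mainly helps when arrays are kept alive by reference cycles.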

DaveP
  • Yeah, I ended up doing that. Thanks for the suggestion. – tylerthemiler Nov 30 '10 at 22:54
  • Why doesn't MemoryError trigger garbage collection automatically? – endolith Oct 31 '13 at 18:51
  • @endolith For the same reason that you can't pause and clear up memory when your `malloc()` fails: it's too late, it has _already_ failed. You could then double back, GC, and try again, but I think the NumPy developers would rather you fix your code than rely on a band-aid. – PythonNut Sep 28 '14 at 03:26
  • Is the only downside of running `gc.collect` frequently (as required) that it slows down the overall computation? – jds Jul 22 '15 at 14:22
  • @gwg It is also specific to CPython, and I don't think `gc` even exists on Jython or IronPython. – DaveP Jul 22 '15 at 23:44
  • I didn't expect it, but this really solved my problem. – Eike P. Apr 18 '18 at 19:44

Your matrix has 288x288x156 = 12,939,264 entries, which for doubles is roughly 100 MB per array; an expression like u + alpha*p keeps several such arrays alive at once, so peak usage is a few hundred MB. numpy throwing a MemoryError at you just means that the memory needed to perform the operation wasn't available from the OS.
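The per-array arithmetic can be checked directly in pure Python (temporaries in an expression like u + alpha*p multiply this figure several-fold):

```python
entries = 288 * 288 * 156        # 12,939,264 entries
bytes_per_double = 8             # numpy float64
mib = entries * bytes_per_double / 2**20
print(entries, round(mib, 1))    # 12939264 entries, ~98.7 MiB per array
```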

If you can work with sparse matrices this might save you a lot of memory.
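If sparsity did apply, a sketch with scipy.sparse (a hypothetical 2-D example; SciPy's sparse matrices are 2-D, so a 3-D field would need reshaping per slice):

```python
import numpy as np
from scipy.sparse import csr_matrix

dense = np.zeros((1000, 1000))   # 1,000,000 doubles = ~7.6 MiB
dense[0, 0] = 1.0
dense[123, 456] = 2.0

sp = csr_matrix(dense)           # stores only the 2 nonzero entries
print(sp.nnz, dense.nbytes)
```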

Benjamin Bannier
  • But my computer has 24 GB of RAM... is there a way to make sure more is available from Windows? Edit: the version of Python we are using is 32-bit for some reason though :/ Edit2: Unfortunately, sparse matrices aren't an option, as there are values in all of the elements (heat-equation-like problem). – tylerthemiler Nov 30 '10 at 21:18
  • Thanks, I cleared some things from memory and I can now load this. – tylerthemiler Nov 30 '10 at 22:35
  • @tylerthemiler: Use the unofficial 64-bit builds http://www.lfd.uci.edu/~gohlke/pythonlibs/ – endolith Oct 31 '13 at 18:52