
I'm running a simulation of a diffusion-reaction equation in MATLAB, and I pre-allocate the memory for all of my vectors beforehand. However, during the loop in which I solve a system of equations using BICG, the amount of memory MATLAB uses keeps increasing.

For example:

concentration = zeros(N, iterations+1);   % +1 so the t+1 writes stay inside the pre-allocated block

for t = 1:iterations
    concentration(:,t+1) = bicg(matrix, concentration(:,t));
end

As the program runs, the amount of memory MATLAB uses increases, which seems to suggest that the matrix concentration is growing as the program continues, even though I pre-allocated the space. Is this because the elements in the matrix are becoming doubles instead of zeros? Is there a better way to pre-allocate the memory for this matrix, so that all of the memory the program requires is allocated at the start? That would make things easier for me, because then I would know from the start how much memory the program will require and whether the simulation will crash the computer.

Thanks for all your help, guys. I did some searching around and didn't find an answer, so I hope I'm not repeating a question.


EDIT:

Thanks Amro and stardt for your help, guys. I tried running 'memory' in MATLAB, but the interpreter said that command is not supported for my system type. I re-ran the simulation with 'whos concentration' displayed every 10 iterations, and the allocated size of the matrix did not change over time. However, I did notice that the matrix was about 1.5 GB. Even so, System Monitor showed MATLAB using only 300 MB at first (increasing steadily to a little over 1 GB by the end of the simulation). So I'm guessing that MATLAB pre-allocated the memory just fine and there are no memory leaks, but System Monitor doesn't count the memory as in use until MATLAB starts writing values to it in the loop. I don't know why that would be, as I would have imagined that writing zeros would make System Monitor see that memory as 'in use,' but apparently that's not the case here.
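This lazy-commit behaviour can be seen directly: 'whos' reports the full logical allocation immediately, while the OS only counts physical pages as resident once they are first written. A minimal sketch (array dimensions chosen to match the simulation; exact resident numbers depend on the OS):

```matlab
% 'zeros' requests the whole block up front, but Linux commits physical
% pages lazily, so the resident size in System Monitor stays small at first.
A = zeros(4e3, 3e4);                  % whos reports the full allocation at once
w = whos('A');
fprintf('Logical allocation: %.2f GiB\n', w.bytes / 2^30);
A(:) = 1;                             % writing every element forces the OS to
                                      % commit the pages; resident size jumps now
```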

Anyway, I appreciate your help with this. I would vote both of your answers up as I found them both helpful, but I don't have enough reputation points to do that. Thanks guys!

navr91
  • 236
  • 2
  • 13

2 Answers

5

I really doubt it's a memory leak, since most "objects" in MATLAB clean up after themselves once they go out of scope. AFAIK, MATLAB does not use a garbage collector per se, but a deterministic approach to managing memory.

Therefore I suspect the issue is more likely caused by memory fragmentation: when MATLAB allocates memory for a matrix, the block has to be contiguous. As functions are repeatedly called, creating and deleting matrices, fragmentation can become a noticeable problem over time...

One thing that might help you debug is the undocumented profile on -memory option, which tracks memory allocation in the MATLAB profiler. Check out the monitoring tool by Joe Conti as well. Also this page has some useful information.
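For example, assuming the simulation lives in a script or function (run_simulation is a hypothetical name standing in for your code):

```matlab
profile on -memory    % undocumented flag: also record allocation statistics
run_simulation        % hypothetical name for the code under investigation
profile viewer        % the report now includes memory columns per function
```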

Amro
  • 123,847
  • 25
  • 243
  • 454
1

I am assuming that you are watching MATLAB's memory usage in, for example, the Task Manager on Windows. The memory usage is probably increasing due to the execution of bicg() and variables that have not been garbage collected after it ends. The memory allocated to the concentration matrix stays the same. You can type

whos concentration

before and after your "for" loop to see how much memory is allocated to that variable.
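For instance, a minimal sketch of that check (sizes taken from the comments below, N = 4e3 and iterations = 3e4):

```matlab
N = 4e3; iterations = 3e4;                 % values from the comments
concentration = zeros(N, iterations);      % pre-allocation as in the question
w_before = whos('concentration');
% ... run the simulation loop here ...
w_after = whos('concentration');
% Equal byte counts confirm the matrix was never reallocated or grown.
fprintf('before: %d bytes, after: %d bytes\n', w_before.bytes, w_after.bytes);
```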

stardt
  • 1,179
  • 1
  • 9
  • 14
  • Oh, that makes sense! I'm on Ubuntu and watching the memory usage in System Monitor. There's no way to aid in the garbage collection after the execution of BICG? Because the memory usage just steadily increases as the program runs, and I can't run my simulation long enough, because MATLAB chews through all my memory and then starts paging and it gets very, very slow. Also, I'd upvote your answer, but I'm new so I don't have enough reputation points to upvote yet. – navr91 Jul 30 '11 at 20:54
  • @navr91: From the documentation for "clear": On UNIX systems, clear does not affect the amount of memory allocated to the MATLAB process. This suggests that the memory allocated to matlab can only increase. There is also "pack": http://www.mathworks.com/support/tech-notes/1100/1106.html#3 – stardt Jul 30 '11 at 21:02
  • Ahh, okay, so it looks like maybe I could run the simulation for some time, save the final state, quit and reload MATLAB, and then run the simulation again with the final state from the previous simulation being the initial state of the next round. – navr91 Jul 30 '11 at 21:16
  • 1
    Do you actually get out of memory errors? As long as it isn't interfering with anything else, why does it matter if matlab has allocated a lot of memory? – stardt Jul 30 '11 at 22:03
  • The simulation eventually gets to the point where I have no more RAM left, and MATLAB starts using the swap... at that point it gets very slow. – navr91 Jul 30 '11 at 22:32
  • @navr91: can you give us a clearer picture of the platform you are working on? are you using a 64-bit version of MATLAB on a 64-bit OS? what is the output of memory tracking functions like `memory`? – Amro Jul 30 '11 at 23:01
  • @navr91: what is N and iterations? – stardt Jul 30 '11 at 23:15
  • @Amro: 64-bit version of MATLAB on a 64-bit OS. I guess I'll have to rerun the simulation to get the memory output. I didn't think of that. I'll do it again and let you guys know. Thanks for your continued support. – navr91 Jul 30 '11 at 23:15
  • @stardt: N=4e3, iterations=3e4 – navr91 Jul 30 '11 at 23:17
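The restart workaround navr91 describes above could be sketched as follows (checkpoint.mat is a hypothetical filename):

```matlab
% Before quitting MATLAB: persist the final state of this run.
save('checkpoint.mat', 'concentration');

% After restarting MATLAB: seed the next run with the saved final column.
S = load('checkpoint.mat');
initial = S.concentration(:, end);    % initial state for the next round
```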