
Title says it all.

Basically, I am getting tired of having to reset my computer every time I mistakenly make MATLAB use a large amount of RAM for a simulation I am creating with many parameters.

Is there a way to make it stop/error out if the RAM usage starts to exceed a specified percentage of my total RAM?

I know that I could put try/catch blocks everywhere I allocate memory, but this is for a program that is already written, and also, for future reference, I want to be able to just set a parameter at the beginning and be done with it.

Is there a way?

Thanks!

Spacey

2 Answers


You can set a virtual memory quota for a process group. On Windows, use a Job object. On *nix, use ulimit. This works with any process, not just MATLAB.

See

Ben Voigt
  • Does the job object allow creating limits for current processes only, or would it also work with any future instances? – Jonas Jan 03 '13 at 18:41
  • 2
    @Jonas: It works on whatever processes you add to the job object. It isn't persistent on the executable file, if that's what you mean, but you can always write (or download; I'm sure someone has written such a thing already) a little launcher to configure a job object and start a MatLab instance inside that job. – Ben Voigt Jan 03 '13 at 19:47
  • Thanks for the explanation. I'll fix my answer. (+1 btw). – Jonas Jan 03 '13 at 19:54
  • @BenVoigt Thanks, however I am not quite sure what exactly to do to implement this... I looked over the windows link, and the link for the top answer, but I just got more confused. The answer with the app might be easier. Which one did you mean? – Spacey Jan 04 '13 at 16:21
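The launcher described in the comments above can be sketched in C for Windows. This is an untested sketch under stated assumptions: the 8 GiB cap and the `matlab.exe -nodesktop` command line are placeholders, and error handling is minimal. The process is started suspended so it can be assigned to the job before it runs:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Create an anonymous job object and cap the job's total committed
       memory at 8 GiB (placeholder value). */
    HANDLE job = CreateJobObjectA(NULL, NULL);
    if (job == NULL) { fprintf(stderr, "CreateJobObject failed\n"); return 1; }

    JOBOBJECT_EXTENDED_LIMIT_INFORMATION info = {0};
    info.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_JOB_MEMORY;
    info.JobMemoryLimit = (SIZE_T)8 * 1024 * 1024 * 1024;
    if (!SetInformationJobObject(job, JobObjectExtendedLimitInformation,
                                 &info, sizeof(info))) {
        fprintf(stderr, "SetInformationJobObject failed\n"); return 1;
    }

    /* Start MATLAB suspended so it can be added to the job before it runs. */
    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi;
    char cmd[] = "matlab.exe -nodesktop";   /* path/flags are assumptions */
    if (!CreateProcessA(NULL, cmd, NULL, NULL, FALSE, CREATE_SUSPENDED,
                        NULL, NULL, &si, &pi)) {
        fprintf(stderr, "CreateProcess failed\n"); return 1;
    }
    AssignProcessToJobObject(job, pi.hProcess);
    ResumeThread(pi.hThread);

    WaitForSingleObject(pi.hProcess, INFINITE);
    return 0;
}
```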

The problem you see occurs when Matlab starts to use virtual memory. You should normally be able to kill the Matlab process via the Task Manager, but that's not always desirable. There is no simple Matlab-internal switch that will globally limit the maximum array size, unfortunately.

What you can do is to make the swap file size very small, so that Matlab can't really write much to it, but this may in turn affect the performance of other programs. Other, non-Matlab solutions would be to switch to Linux (where you can set memory limits for a program more easily, see @BenVoigt's answer for details on setting limits on both Windows and Linux), or to run everything in a virtual machine.

For future reference: in my simulations, I have a method (a subfunction, if you don't want to go OOP) at the beginning of my pre-allocation that calculates the estimated total memory usage given the simulation parameters (the number of elements of all the large arrays I'll use, times 8 for doubles, gives memory in bytes), and that throws an error when the simulation would use too much RAM.

Here's an example for a quick memory check. I know that I'm going to allocate 3 m-by-3-by-t arrays, and 5 m-by-t arrays, all of them double.

maxMemFrac = 0.8; %# I want to use at most 80% of the available memory

numElements = 3 * (m * 3 * t) + 5 * (m * t);
numBytesNeeded = numElements * 8; %# I use double

%# read available memory (note: the MEMORY function is Windows-only)
[~,memStats] = memory;

if numBytesNeeded > memStats.PhysicalMemory.Available * maxMemFrac
   error('MYSIM:OUTOFMEMORY','too much memory would be needed')
end
Jonas
  • Jonas, thanks for that answers, can you describe in more detail how you are making/using your RAM-calculator-method here? I think that will be a workable solution if I can use it as well. Thanks. – Spacey Jan 03 '13 at 18:00
  • Switch to Linux? For all we know, he could be running Linux already. – Ben Voigt Jan 03 '13 at 18:15
  • @BenVoigt I am on a Windows machine; I do not think switching to Linux is feasible for me right now. – Spacey Jan 03 '13 at 18:19
  • @Learnaholic: I've added an example. – Jonas Jan 03 '13 at 18:39
  • @Jonas: It's a valid idea. My concern is that the pre-estimate of memory usage won't be accurate when it matters most: when there's a bug. – Ben Voigt Jan 03 '13 at 21:04