I am currently trying to debug the root cause of an out-of-memory condition in a Fortran-based simulation program, running on Linux, compiled with Intel Fortran. The program is large and developed over many years, so I have only partial knowledge of the implementation.

A memory leak is unlikely; more plausibly the program is simply running out of memory due to the size of the model. However, the model at hand should not be large enough for that. I suspect that somewhere in the call stack a global variable is consuming much more memory than I expect for this model.

What tools are available to analyze the memory consumption of the program at the time of the out-of-memory error?

Does gdb, for example, have features for that? I know I can use it to inspect local variables, but in this program the main memory consumption is likely heap memory managed through global variables.

kdb
  • Turn on whatever debugging options Intel supports. If you're running on Linux, it should be easy to install gfortran. You can then compile with `-fcheck=all -ffpe-trap=invalid,zero -Wall -fmax-errors=1` to see if an issue is found. – steve Sep 08 '22 at 17:01
  • You could look into valgrind's massif tool (https://valgrind.org/docs/manual/ms-manual.html) and/or GNU's memory debugging tools (https://www.gnu.org/software/libc/manual/html_node/Allocation-Debugging.html) – Ian Bush Sep 08 '22 at 22:05
  • A simple way is to run the program in a debugger, set some breakpoints where you think it's relevant, and at each breakpoint check the memory occupied by the process with the `ps` or `top` commands. – PierU Sep 09 '22 at 08:16
  • Meanwhile, internal discussions pointed to the crash being caused by the limited stack size (and Fortran allocating large temporary arrays on the stack). – kdb Sep 09 '22 at 08:24

0 Answers