I am trying to find the root cause of an out-of-memory condition in a Fortran-based simulation program running on Linux, compiled with Intel Fortran. The program is large and has been developed over many years, so I have only partial knowledge of the implementation.
A memory leak is unlikely; the more plausible explanation is that the program simply runs out of memory because of the size of the model. However, the model at hand should not require that much memory, so I suspect that somewhere in the call stack a global variable is consuming much more memory than I expect for this model.
What tools are available to analyze the memory consumption of the program at the time of the out-of-memory error?
Does e.g. gdb have features for that? I know that I can use it to inspect local variables, but in this program the main memory consumption is most likely heap memory managed through global variables.
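For context, here is a minimal sketch (with hypothetical module and variable names, not taken from the actual code) of the kind of "heap memory managed through global variables" I mean: a module-level allocatable array whose allocation size depends on the model.

```fortran
! Hypothetical example, not the real code: a global (module) variable that
! owns a large heap allocation sized from the model.
module grid_data
   implicit none
   real(kind=8), allocatable :: field(:,:,:)   ! global handle to heap memory
end module grid_data

subroutine init_fields(nx, ny, nz)
   use grid_data
   implicit none
   integer, intent(in) :: nx, ny, nz
   ! If nx*ny*nz turns out much larger than expected for this model,
   ! this single allocation alone could explain the out-of-memory condition.
   allocate(field(nx, ny, nz))
   field = 0.0d0
end subroutine init_fields
```

I would like a way to find out, at the point of failure, which of these global allocations are the large ones.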