
If you let the program below run indefinitely, it eventually uses up all RAM and the OS starts swapping (it took ~5 minutes to use up 64 GB on my workstation). If what was answered here is true: When does the heap memory actually get released?, then why does the OS start swapping instead of reclaiming the supposedly unused RAM?

As you can see in the code below, the large vector is supposed to be freed at the end of each loop iteration (it goes out of scope), so I expected the program to occupy the same amount of RAM the whole time.

Note: if you have less memory, you can set `nIter` to a lower value.

#include <iostream>
#include <vector>

int main()
{
    int nIter = 700000000;
    while(true){
        std::vector< std::vector<double> > data;
        data.reserve(nIter);
        std::vector<double> dataLine( 7, 0.0 );
        for( int i = 0; i < nIter; ++i ){
            data.push_back( std::vector<double>() ); // append an empty inner vector...
            data.back() = dataLine;                  // ...then copy-assign the 7-element line (this allocates its storage)
        }
    } // End of scope: 'data' is destroyed here and its memory should be released.

    return 0;
}
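
For reference, a rough estimate of what a single pass of the outer loop tries to allocate. This is only back-of-the-envelope arithmetic and assumes a typical 64-bit implementation (sizeof(double) == 8, an inner std::vector<double> object of three pointers); the small program below just prints the numbers:

#include <cstddef>
#include <iostream>
#include <vector>

int main()
{
    const std::size_t nIter = 700000000;

    // 7 doubles of payload per line.
    const std::size_t payload = nIter * 7 * sizeof(double);
    // One inner-vector object (typically three pointers) per line, stored in 'data'.
    const std::size_t handles = nIter * sizeof(std::vector<double>);

    const double GiB = 1024.0 * 1024.0 * 1024.0;
    std::cout << "payload: " << payload / GiB << " GiB\n"
              << "handles: " << handles / GiB << " GiB\n"
              << "total:   " << (payload + handles) / GiB
              << " GiB, plus heap overhead for " << nIter << " small allocations\n";
    return 0;
}

On such a platform this comes to roughly 50 GiB for a single iteration before any allocator overhead, which is in line with the figures discussed in the comments below.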
Paulo Carvalho
  • This code has undefined behavior. It's an infinite loop with no observable behavior. See [process guarantees](https://en.cppreference.com/w/cpp/language/memory_model#Progress_guarantee). – François Andrieux Sep 18 '18 at 16:25
  • Are you sure it isn't failing on the first iteration? Edit: Your total memory requirement is at least 30 GB, at least on my platform. – François Andrieux Sep 18 '18 at 16:29
  • @FrançoisAndrieux 7 (columns) * 8 (bytes per double) * 700,000,000 (lines) should give that figure, yes. I hadn't noticed that, so I guess it's even stranger. Well, you can observe the result by monitoring resource consumption in your OS (`top` on Linux, Task Manager on Windows, etc.). – Paulo Carvalho Sep 18 '18 at 16:36
  • I can't, because when I tried it, it failed the first time it tried to `data.reserve(700000000);`. Edit: You may simply be seeing the increase from each new element allocating its data in the `for` loop. It likely runs out of RAM during that process in the first iteration of the `while` loop. – François Andrieux Sep 18 '18 at 16:39
  • I cannot reproduce this on my machine. I had to drop it to `7000000`, though, to get it not to throw a `bad_alloc` exception. Memory fluctuates up and down but doesn't slowly increase. – NathanOliver Sep 18 '18 at 16:39
  • @NathanOliver That's what I was expecting: constant memory consumption. I'm running it on CentOS 6, 64-bit. – Paulo Carvalho Sep 18 '18 at 16:43
  • @PauloCarvalho If you change `nIter` to `7000000` like I did does it still increase or stay stable? – NathanOliver Sep 18 '18 at 16:45
  • Works OK for me on CentOS 6.10 x86_64, with a smaller size. 700000000 throws `bad_alloc` – Jonathan Wakely Sep 18 '18 at 16:45
  • @NathanOliver I guess you nailed it. It fluctuates between 80MB and 500MB. – Paulo Carvalho Sep 18 '18 at 16:47
  • My point is that I'm trying to figure out what was going on here: https://stackoverflow.com/questions/52388325/apparent-memory-leak-with-stdvector-of-stdvectors – Paulo Carvalho Sep 18 '18 at 16:50
  • @FrançoisAndrieux That's a likely explanation as to why it fails later. – Paulo Carvalho Sep 18 '18 at 19:23
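
Following up on the comments, here is a minimal sketch of the reduced test (nIter lowered to 7000000 as NathanOliver suggested, with a std::bad_alloc handler and a print per pass so the loop has observable behavior; the pass count of 100 is arbitrary):

#include <cstddef>
#include <iostream>
#include <new>
#include <vector>

int main()
{
    const std::size_t nIter = 7000000;  // reduced size from the comments

    for (int pass = 0; pass < 100; ++pass) {
        try {
            std::vector<std::vector<double>> data;
            data.reserve(nIter);
            const std::vector<double> dataLine(7, 0.0);
            for (std::size_t i = 0; i < nIter; ++i) {
                data.push_back(dataLine);  // one small heap allocation per line
            }
            std::cout << "pass " << pass << ": " << data.size() << " lines\n";
        } catch (const std::bad_alloc&) {
            std::cout << "pass " << pass << ": allocation failed (bad_alloc)\n";
            break;
        }
        // 'data' is destroyed at the end of the try block; its memory goes back
        // to the allocator, which may keep it cached rather than returning it to
        // the OS, but it is reused on the next pass, so resident memory should
        // fluctuate rather than grow.
    }
    return 0;
}

With this size, resident memory fluctuates rather than growing pass after pass, which matches the fluctuation (rather than steady growth) reported in the comments above.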

0 Answers