This is my very first question here on Stack Overflow. I have searched extensively for an explanation of what I'm seeing with the following lines of code:
    unsigned long long _mem1 = getUsedVirtualMemory();
    vector.erase(vector.begin() + _idx);
    contained = false; // don't stop the loop
    _idx--; // the erase shifted the remaining elements, so this index must be considered again
    _mem1 = getUsedVirtualMemory() - _mem1;
    if (_mem1 > 0) printf("Memory - 2 mem1: %llu\n", _mem1); // %llu for unsigned long long
My program has huge memory consumption, and after an intensive debugging session, some printfs, and time-consuming analysis, I arrived at this point:
getUsedVirtualMemory is implemented with the following code:
    // requires <windows.h> and <psapi.h>, link with psapi.lib
    PROCESS_MEMORY_COUNTERS_EX pmc;
    GetProcessMemoryInfo(GetCurrentProcess(), (PROCESS_MEMORY_COUNTERS*) &pmc, sizeof(pmc));
    SIZE_T virtualMemUsedByMe = pmc.PrivateUsage;
    return virtualMemUsedByMe;
to obtain the amount of virtual memory allocated by the process; the vector is a vector of objects (not pointers).
In most cases the vector's erase method seems to work as expected, but in some cases erase appears to increase the memory used by the process instead of freeing it. I use the Windows system function GetProcessMemoryInfo in many places in the code to debug this problem, and it seems to return an accurate value for the used virtual memory.
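For reference, this is a minimal, self-contained sketch of how I take the measurements around a single erase. The Foo payload type and the element counts are placeholders invented only for the example; the measurement helper mirrors the code above.

    // link with psapi.lib
    #include <windows.h>
    #include <psapi.h>
    #include <cstdio>
    #include <vector>

    // same measurement as above: private (committed) bytes of this process
    static unsigned long long getUsedVirtualMemory()
    {
        PROCESS_MEMORY_COUNTERS_EX pmc;
        GetProcessMemoryInfo(GetCurrentProcess(), (PROCESS_MEMORY_COUNTERS*)&pmc, sizeof(pmc));
        return pmc.PrivateUsage;
    }

    struct Foo { char payload[256]; }; // placeholder for the real object type

    int main()
    {
        std::vector<Foo> v(10000); // element count chosen arbitrarily

        unsigned long long before = getUsedVirtualMemory();
        v.erase(v.begin() + 5000); // remove one element from the middle
        unsigned long long after = getUsedVirtualMemory();

        // erase only shifts elements and reduces size(); capacity() and the
        // underlying allocation stay the same, so I expect no increase here
        printf("before: %llu  after: %llu\n", before, after);
        return 0;
    }

My expectation is that before and after are equal, which is what makes the increase I actually observe so surprising.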
I'm using Visual Studio C++ 2010 Professional.
Let me know if more information is needed.
Thanks for any replies.
UPDATE:
Everything you wrote in your replies is correct, and I forgot to mention the following details:

- I already know that a vector has a size (the actual number of elements) and a capacity (the number of allocated slots for storing elements); see the small sketch after this list
- I already know that the erase method does not free memory (I have read a lot of documentation about that method)
- finally, I will add other elements to that vector later, so I don't need to shrink it.
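To illustrate the size/capacity point in the first bullet, here is a small sketch; the element type and counts are arbitrary, chosen only for the example:

    #include <cstdio>
    #include <vector>

    int main()
    {
        std::vector<int> v(1000, 42);
        printf("size=%u capacity=%u\n", (unsigned)v.size(), (unsigned)v.capacity());

        v.erase(v.begin() + 10);

        // erase() reduces size() by one but leaves capacity() untouched,
        // so the vector keeps its allocation and returns no memory to the OS
        printf("size=%u capacity=%u\n", (unsigned)v.size(), (unsigned)v.capacity());
        return 0;
    }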
The actual problem is that, in this case, the value of "_mem1" in the last line of code shows a difference of 1,600,000 bytes: an unjustified increase in memory, where I expected 0 bytes.
Even in the case where the memory used after the operation were lower than before, I would expect a very large number because of unsigned wraparound, as explained for instance at Is unsigned integer subtraction defined behavior?
Instead of the expected results, I get a value greater than 0 but relatively small (see the snippet below).
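For clarity, this is the wraparound behaviour I mean; the numbers are illustrative only:

    #include <cstdio>

    int main()
    {
        unsigned long long before = 100;
        unsigned long long after  = 90; // pretend memory actually went down

        // unsigned subtraction wraps around modulo 2^64, so a decrease would
        // show up as a huge value near 2^64, not as a small positive number
        unsigned long long diff = after - before;
        printf("%llu\n", diff); // prints 18446744073709551606
        return 0;
    }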
To give an idea of the scale of the problem: iterating over that piece of code some thousands of times unexpectedly allocates about 20 GB of virtual memory.