I want to set up some per-user memory limits via limits.conf on a Linux system.
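The kind of entry I have in mind is something like the following (the `as` item caps a process's virtual address space, in KiB; the domain and value here are just placeholders for my test user):

# /etc/security/limits.conf
# <domain>   <type>  <item>  <value>
pbertoni     hard    as      1048576    # cap address space at 1 GiB

For the sake of testing, I wrote a minimal program: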
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t mbs = 32;
    char *leak;
    while (1)
    {
        // calloc variant: the block is nominally zero-filled
        // leak = calloc(1024 * 1024 * mbs, sizeof(char));
        leak = malloc(1024 * 1024 * mbs * sizeof(char));
        printf("%9zu MiB allocated at address %p.", mbs, (void *)leak);
        // Dereference of the last byte, commented out -- see below
        // printf(" Last byte has value %d.", leak[1024 * 1024 * mbs - 1]);
        printf(" Press enter to double them.");
        getchar();
        mbs *= 2;
    }
}
A typical run looks like this:
$ gcc -g -O0 memleak.c -o memleak && ./memleak
32 MiB allocated at address 0x14b92f312010. Press enter to double them.
64 MiB allocated at address 0x14b92b311010. Press enter to double them.
<.....>
32768 MiB allocated at address 0x14a933308010. Press enter to double them.
65536 MiB allocated at address 0x149933307010. Press enter to double them.
131072 MiB allocated at address (nil). Press enter to double them.
262144 MiB allocated at address (nil). Press enter to double them.^C
This is on a system with 64 GiB of RAM and 2 GiB of swap. As you can see, malloc starts returning NULL once the requested block exceeds 65536 MiB.
What puzzles me is that the leak does not seem to be anything "real". I mean, in another shell I'm monitoring the process with:
$ top -c -p $(pgrep memleak)
    PID     USER      PR  NI  VIRT   RES    SHR   S  %CPU  %MEM  TIME+    COMMAND
    3102203 pbertoni  30  10  35268  33772  1012  S  0,0   0,1   0:00.01  ./memleak
Only VIRT actually follows the geometric series. RES stays roughly constant: with calloc it sits near the size of the first allocation, ~32 MiB; with malloc it drops to about 580 KiB (top reports RES in KiB).
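For what it's worth, the same two numbers can also be read from inside the process via Linux's /proc/self/statm (first field: total program size, i.e. VIRT; second field: resident set size, i.e. RES; both in pages). A minimal sketch:

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

// Print this process's VIRT and RES, as top would show them.
// /proc/self/statm reports sizes in pages: the first field is the
// total program size, the second the resident set size.
static void print_mem_usage(const char *tag)
{
    unsigned long size, resident;
    long page_kib = sysconf(_SC_PAGESIZE) / 1024;
    FILE *f = fopen("/proc/self/statm", "r");
    if (f == NULL)
        return;
    if (fscanf(f, "%lu %lu", &size, &resident) == 2)
        printf("%s: VIRT %lu KiB, RES %lu KiB\n",
               tag, size * page_kib, resident * page_kib);
    fclose(f);
}

int main(void)
{
    print_mem_usage("before");
    char *p = malloc(64UL * 1024 * 1024); // 64 MiB, never touched
    print_mem_usage("after malloc");      // VIRT grows, RES barely moves
    (void)p;
    return 0;
}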
The commented-out dereference of the block's last byte throws a SIGSEGV once the allocations cross the physical 64 GiB threshold, which is why I left it commented out.
Anyway, my point is: why am I consuming only "virtual" memory even when I actually write to it (with calloc, for example)? I expected it to become "resident" instead. What am I missing?
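For completeness, here is the variant I would expect to behave differently: the same loop, but with an explicit memset over every allocated byte (a sketch, built with the same -O0 flags as above so the write isn't optimized away):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    size_t mbs = 32;
    char *leak;
    while (1)
    {
        size_t bytes = 1024 * 1024 * mbs;
        leak = malloc(bytes);
        if (leak == NULL)
            break;
        // Write every byte: the kernel must now back each page with a
        // physical frame, so RES should follow VIRT (until RAM + swap
        // run out).
        memset(leak, 1, bytes);
        printf("%9zu MiB allocated and touched at %p."
               " Press enter to double them.", mbs, (void *)leak);
        getchar();
        mbs *= 2;
    }
    return 0;
}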