13

I am playing around to understand how much memory can be allocated. Initially I thought that the maximum memory which can be allocated is equal to the physical memory (RAM). I checked my RAM on Ubuntu 12.04 by running the command shown below:

~$ free -b
             total       used       free     shared    buffers     cached
Mem:    3170848768 2526740480  644108288          0  265547776 1360060416
-/+ buffers/cache:  901132288 2269716480
Swap:   2428497920          0 2428497920

As shown above, the total physical memory is 3 GiB (3170848768 bytes), of which only 644108288 bytes are free, so I assumed I could allocate at most that much memory. I tested it by writing a small program with only the two lines below:

char *p1 = new char[644108290];
delete[] p1;

Since the code ran perfectly, it means it allocated the memory successfully. I also tried to allocate more memory than the available free physical memory, and it still did not throw any error. Then, per the question

maximum memory which malloc can allocate

I thought it must be using virtual memory. So I tested the code with the free swap size, and it also worked:

char *p1 = new char[2428497920];
delete[] p1;

Then I tried to allocate the free swap plus free RAM bytes of memory:

char *p1 = new char[3072606208];
delete[] p1;

But this time the code failed, throwing a bad_alloc exception. Why didn't the code work this time?

Now I allocated the memory at compile time in a new program as shown below:

char p1[3072606208];
char p2[4072606208];
char p3[5072606208];
cout << "Size of array p1 = " << sizeof p1 << endl;
cout << "Size of array p2 = " << sizeof p2 << endl;
cout << "Size of array p3 = " << sizeof p3 << endl;

The output shows

Size of array p1 = 3072606208
Size of array p2 = 4072606208
Size of array p3 = 777638912

Could you please help me understand what is happening here? Why was the memory allowed to be allocated at compile time but not dynamically? When allocated at compile time, how come p1 and p2 were able to reserve more memory than swap plus free RAM, whereas p3 failed? How exactly does this work? Is this some undefined behaviour or OS-specific behaviour? Thanks for your help. I am using Ubuntu 12.04 and gcc 4.6.3.

Forever Learner
    In your test program with the stack arrays, please try actually _using_ the arrays other than with `sizeof` (write something to the end of them). – Mat Oct 25 '12 at 18:19
  • A computer can extend its RAM by using *virtual* memory. A section of memory can be swapped to a disk or other device when not in use. – Thomas Matthews Oct 25 '12 at 20:03

5 Answers

8

Memory pages aren't actually mapped to your program until you use them. All malloc does is reserve a range of the virtual address space. No physical RAM is mapped to those virtual pages until you try to read or write them.

Even when you allocate global or stack ("automatic") memory, there's no mapping of physical pages until you touch them.

Finally, sizeof() is evaluated at compile time, when the compiler has no idea what the OS will do later. So it will just tell you the expected size of the object.

You'll find that things will behave very differently if you try to memset the memory to 0 in each of your cases. Also, you might want to try calloc, which zeroes its memory.
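
For example, here is a minimal sketch (assuming a typical 4 KiB page size) that reserves a big block and then commits it one page at a time; watch the process's resident size in top while the loop runs:

#include <cstddef>
#include <cstdio>
#include <new>

int main() {
    // Reserve 1 GiB of address space; no physical pages are committed yet.
    const std::size_t len = 1UL << 30;
    char* p = new (std::nothrow) char[len];
    if (!p) { std::puts("allocation failed"); return 1; }

    std::puts("reserved -- press Enter to touch the pages");
    std::getchar();

    // Touch one byte per page: each first write faults in a physical page,
    // so resident memory only grows during this loop.
    for (std::size_t i = 0; i < len; i += 4096)
        p[i] = 0;

    std::puts("touched -- press Enter to exit");
    std::getchar();
    delete[] p;
    return 0;
}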

Mike DeSimone
2

Interesting.... one thing to note: when you write

char p[1000];

you allocate (well, reserve) 1000 bytes on the stack.

When you write

char* p = malloc(100);

you allocate 100 bytes on the heap. Big difference. Now, I don't know why the stack allocations are working, unless the value between the [] is being read as a 32-bit integer by the compiler and is thus wrapping around to allocate a much smaller block.
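
That wraparound theory matches the output in the question: 5072606208 mod 2^32 = 5072606208 - 4294967296 = 777638912, which is exactly the size reported for the third array.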

Most OSs don't allocate physical memory up front anyway; they give you pages from a virtual address space which remain unused (and therefore unallocated) until you touch them, at which point the memory-management unit of the CPU will nip in to give you the memory you asked for. Try writing to those bytes you allocated and see what happens.

Also, on Windows at least, when you allocate a block of memory, you can only reserve the largest contiguous block the OS has available, so as the address space gets fragmented by repeated allocations, the largest single block you can malloc shrinks. I don't know if Linux has this problem too.
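
A contrived sketch of that effect below; note it only shows up in a cramped 32-bit address space (on 64-bit Linux the address space is far too large to fragment this way), so treat it as an illustration rather than a reliable test:

#include <cstdio>
#include <cstdlib>
#include <vector>

int main() {
    std::vector<char*> blocks;

    // Grab 10 MiB chunks until no more can be reserved.
    while (char* p = (char*)std::malloc(10 * 1024 * 1024))
        blocks.push_back(p);

    // Free every other chunk: lots of total free memory remains, but no
    // contiguous run larger than ~10 MiB.
    for (std::size_t i = 0; i < blocks.size(); i += 2) {
        std::free(blocks[i]);
        blocks[i] = NULL;
    }

    // Likely prints (nil) here, despite enough total free bytes.
    std::printf("100 MiB block after fragmentation: %p\n",
                (void*)std::malloc(100 * 1024 * 1024));
    return 0;
}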

gbjbaanb
    `char p[1000];` does not necessarily allocate memory on the stack. If the declaration is made at file scope it most certainly does not allocate on the stack -- or on the heap. – David Hammen Oct 25 '12 at 18:42
  • I believe on linux you will get the memory. It automatically breaks up any memory request larger than 128kb into separate chunks which are appropriately mapped into the virtual address space. So even when you start using your memory, not all of it will necessarily be allocated. – Dunes Oct 25 '12 at 18:50
2

There's a huge difference between these two programs:

program1.cpp

#include <iostream>

int main () {
   char p1[3072606208];
   char p2[4072606208];
   char p3[5072606208];

   std::cout << "Size of array p1 = " << sizeof(p1) << std::endl;
   std::cout << "Size of array p2 = " << sizeof(p2) << std::endl;
   std::cout << "Size of array p3 = " << sizeof(p3) << std::endl;
}

program2.cpp:

#include <iostream>

char p1[3072606208];
char p2[4072606208];
char p3[5072606208];

int main () {

   std::cout << "Size of array p1 = " << sizeof(p1) << std::endl;
   std::cout << "Size of array p2 = " << sizeof(p2) << std::endl;
   std::cout << "Size of array p3 = " << sizeof(p3) << std::endl;
}

The first allocates memory on the stack; it's going to get a segmentation fault due to stack overflow. The second doesn't do much at all. That memory doesn't quite exist yet. It's in the form of data segments that aren't touched. Let's modify the second program so that the data are touched:

#include <iostream>

char p1[3072606208];
char p2[4072606208];
char p3[5072606208];

int main () {

   p1[3072606207] = 0;
   p2[4072606207] = 0;
   p3[5072606207] = 0;

   std::cout << "Size of array p1 = " << sizeof(p1) << std::endl;
   std::cout << "Size of array p2 = " << sizeof(p2) << std::endl;
   std::cout << "Size of array p3 = " << sizeof(p3) << std::endl;
}

This doesn't allocate memory for p1, p2, or p3 on the heap or the stack. That memory lives in data segments. It's a part of the application itself. There's one big problem with this: On my machine, this version won't even link.
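
A likely reason for the link failure: with gcc on x86-64, static data that pushes the image past 2 GiB doesn't fit the default small code model, so linking objects that large generally requires -mcmodel=medium or -mcmodel=large.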

David Hammen
    For me the first program ran successfully without any error. For the second program I get a compile-time error saying the size of variables p1 and p2 is too large. – Forever Learner Oct 25 '12 at 19:24
1

The first thing to note is that in modern computers, processes do not get direct access to RAM (at the application level). Rather, the OS provides each process with a "virtual address space". The OS intercepts accesses to virtual memory and reserves real memory as and when needed.

So when malloc or new says it has found enough memory for you, it just means that it has found enough memory in the virtual address space. You can check this by running the following program with the memset line present and then with it commented out. (Careful: this program uses a busy loop.)

#include <iostream>
#include <new>
#include <string.h>

using namespace std;

int main(int argc, char** argv) {

    size_t bytes = 0x7FFFFFFF;
    size_t len = sizeof(char) * bytes;
    cout << "len = " << len << endl;

    char* arr = new char[len];
    cout << "done new char[len]" << endl;

    memset(arr, 0, len); // set all values in array to 0
    cout << "done setting values" << endl;

    while(1) {
        // stops program exiting immediately
        // press Ctrl-C to exit
    }

    return 0;
}

When memset is part of the program you will notice the memory used by your computer jumps massively, and without it you should barely notice any difference, if any. When memset is called, it accesses all the elements of the array, forcing the OS to make the space available in physical memory. Since the argument for new is a size_t (see here), the maximum argument you can call it with is 2^32-1 on a 32-bit system, though an allocation that large isn't guaranteed to succeed (it certainly doesn't on my machine).
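
If you want to see the limit for your own build, a quick check (new's size parameter is std::size_t, so its maximum is whatever your platform reports):

#include <cstddef>
#include <iostream>
#include <limits>

int main() {
    // Prints 4294967295 (2^32 - 1) on a 32-bit build and
    // 18446744073709551615 (2^64 - 1) on a 64-bit build.
    std::cout << std::numeric_limits<std::size_t>::max() << std::endl;
    return 0;
}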

As for your stack allocations: David Hammen's answer says it better than I could. I am surprised you were able to compile those programs. Using the same setup as you (Ubuntu 12.04 and gcc 4.6) I get compile errors like:

test.cpp: In function ‘int main(int, char**)’:

test.cpp:14:6: error: size of variable ‘arr’ is too large

Dunes
0

Try the following code:

#include <cstdio>
#include <new>

int main() {
    bool bExit = false;
    unsigned long long iAlloc = 0;

    do {
        try {
            // Deliberately leak one-byte allocations and count how
            // many succeed before the allocator gives up.
            new char[1]();
            iAlloc++;
        } catch (std::bad_alloc&) {
            bExit = true;
        }
    } while (!bExit);

    std::printf("%llu allocations succeeded\n", iAlloc);
    return 0;
}

In one run, don't open other programs; in the other run, load a few large files in an application that uses memory-mapped files.

This may help you to understand.

marscode