5

I continue experimenting with C. I have this program that lets you decide how much RAM you want to eat.

#include <stdio.h>
#include <stdlib.h>

char * eatRAM()
{
    unsigned long long toEat;
    unsigned long long i;
    float input;
    char * pMemory = NULL;
    int megaByte = 1048576;

    puts("How much RAM do you want to eat? (in Mega Bytes)");
    puts("NOTE: If you want to eat more RAM than you have available\nin your system, the program will crash");
    printf("\n>> MB: ");
    scanf("%f", &input);

    toEat = (unsigned long long)(input * megaByte);
    pMemory = malloc(toEat);

    if(pMemory != NULL)
    {
        printf("\n\nEating in total: %llu Bytes\n", toEat);
        puts("Check your task manager!\n");

        for(i = 0; i < toEat; i++)
        {
            pMemory[i] = 'x';
        }
    }
    else
    {
        puts("\nSeems like that amount of memory couldn't be allocated :( \n");
    }
    return pMemory;
}

UPDATED QUESTION:

The thing is that... if I enter for example 1024MB it works; I can see in the task manager that it is using 1GB of RAM. Even if I enter 1500MB it works.

But if I enter 2048MB it says

Seems like that amount of memory couldn't be allocated :(

or even if I enter 1756MB

Remember I'm new to C; maybe I'm omitting something important about how the OS lets me access memory. What could it be?

Hans Passant
Juan Bonnett
  • malloc probably fails and returns null, check that – Pedro Sassen Veiga Nov 29 '15 at 06:57
  • Crashes because you try to access the memory although the allocation failed, you must check `pMemory != NULL` before trying to use it. Also, you should not cast `malloc()`, in general never cast from `void *` to any other pointer. – Iharob Al Asimi Nov 29 '15 at 06:57
  • How much RAM does your computer have? – Iharob Al Asimi Nov 29 '15 at 07:06
  • Ok, I've handled the error as you suggested. I still want to know, why couldn't it allocate that memory even if I have more than 5GB available when I run the program? – Juan Bonnett Nov 29 '15 at 07:06
  • Why is `input` a `float`? And you don't check the return value of `scanf()`; in case it fails your program would also behave in a weird way, and probably try to allocate much more memory than you think. – Iharob Al Asimi Nov 29 '15 at 07:09
  • Ahhhh, yeah, I originally declared input as float because I wanted the user to be able to test for example 10.5 MB, but... maybe it's useless seeing that the actual operation is made with long integers... Anyway, this program is for experimentation and personal use... I'm just learning C – Juan Bonnett Nov 29 '15 at 07:11
  • I can't think of a reason, but it might be a MS Windows limitation. Have you tried it on a Linux system? – Iharob Al Asimi Nov 29 '15 at 07:17
  • Nope... Gonna install a VM and experiment there, who knows, you might have the reason with the MS Windows Limitation. But then, how could videogames or heavy software be able to run anyway? – Juan Bonnett Nov 29 '15 at 07:19
  • I wonder if the Windows heap has a per-block size limitation that could cause this? It would be interesting to run this in a loop (i.e. could you allocate multiple 1G blocks before failure). – keithmo Nov 29 '15 at 07:49
  • If you are going to test the same thing in a Linux environment, you may encounter [a nasty surprise](http://stackoverflow.com/q/19750796/2564301)! (Or "pleasant", depending on whether you think this is a good or a bad thing.) – Jongware Nov 29 '15 at 11:22
  • Maybe this helps you out: http://stackoverflow.com/questions/5686459/what-is-the-maximum-memory-available-to-a-c-application-on-32-bit-windows I also ran into the above problem when using a memory stream, which was running out of memory after having more than 2gb allocated on a 64bit system. – Revils Nov 29 '15 at 07:22
  • @RadLexus in this case the OP actually **uses** the allocated memory, so the situation won't be different from Windows to other OSes – phuclv Dec 31 '16 at 07:49

3 Answers

6

A 32-bit process on Windows has a 2 gigabyte address space available by default: the bottom half of the full pow(2, 32) address space, with the top 2 GB reserved for the operating system. Since just about nobody runs a 32-bit OS anymore, you can get 4 GB when you link your program with /LARGEADDRESSAWARE.

That 2 GB VM space needs to be shared by code and data. Your program typically loads at 0x00400000; any operating system DLLs you use (like kernel32.dll and ntdll.dll) have high load addresses (beyond 0x7F000000). And at least the startup thread's stack and the default process heap are created before your program starts running; their addresses are generally unpredictable.

Your program will be subjected to shrink-wrapped viral attacks in most any OS install: you'll have DLLs injected that provide "services" like anti-malware and cloud storage. The load addresses of those DLLs are unpredictable, as are those of any DLLs that you linked with yourself and that are implicitly loaded when your program starts. Few programmers pay attention to their preferred base address and leave it at the default, 0x10000000. You can see these DLLs in the debugger's Modules window. Such DLLs often have their own CRT and tend to create their own heaps.

Allocations you make yourself, particularly very large ones that won't come from the low-fragmentation heap, need to find address space in the holes left between existing code and data allocations. If you can get 1500 MB, your VM is pretty clean. In general you'll get into trouble beyond 650 MB, a number that quickly shrinks once the program has been running for a while and has fragmented the VM space. Allocation failures are almost always caused by the OS being unable to find a big enough hole, not by running out of VM; the sum of the holes can be considerably larger than your failed allocation request.

These details are rapidly becoming a folk tale; there are very few remaining reasons to still target x86. Target x64 and address space fragmentation won't be a problem for the next 20 years: it is very hard to fragment 8 terabytes of VM, with lots of headroom to grow beyond that.

So it should be obvious why you can't get 2048 MB: you can't get it all. Get further insight from SysInternals' VMMap utility, which shows you how the VM is carved up. Mark Russinovich's blog posts and book give lots of background.

mirh
Hans Passant
  • Thank you for taking the time to give such a well supported answer! Gonna keep on researching and practicing! - Cheers! – Juan Bonnett Nov 29 '15 at 14:47
4

It is an OS limit rather than a C limit.

To address more than 4GB system-wide you need to be running a 64-bit OS, and for a single process to address more than 4GB it must be built as a 64-bit app. Win32 has a 2GB per-process memory limit. Your 5GB of physical RAM is largely irrelevant, since the memory is virtualised.

Quite apart from the theoretical limits of 32 and 64 bit systems and applications, an OS may still impose limits. Different versions and editions (Home, Pro, Server etc.) of Windows for example impose specific limits for commercial reasons.

A specific answer in your case would require information about your system, toolchain and the build options applied. If you are using Windows and VC++, you need to consider the /LARGEADDRESSAWARE option; it is not enabled by default in the 32-bit compiler, but Win32 has a 2GB default limit in any case unless physical address extension is enabled.

I believe that a 32-bit process running on Win64 can address the full 4GB 32-bit address space, but you will certainly need to build with /LARGEADDRESSAWARE in that case. Even then, not quite all of that space will be available to the heap, and any single allocation must be contiguous, so it may be limited by previous allocations and heap fragmentation.

Clifford
  • Thank you for taking the time to give such a well supported answer! Gonna keep on researching and practicing! - Cheers! – Juan Bonnett Nov 29 '15 at 14:47
0

Allocation will never work if the amount of remaining free memory is less than the amount you're trying to allocate.

Also, right after this line:

pMemory = malloc(toEat);

Add the following:

if (!pMemory){
  printf("Can't allocate memory\n");
  return NULL;
}

That way, instead of receiving "segmentation fault" related messages, you'll see one "Can't allocate memory" message instead and your function will return NULL.

Make sure you do similar checking in the functions that call your eatRAM function, or you'll still get "segmentation fault" messages. Also, use a debugger such as gdb.

Mike -- No longer here
  • Thank you Mike for replying. Actually I have more than 5GB of AVAILABLE RAM at the time of running the tests with 2048MB, what do you think? Also, I updated my code some minutes ago so it handles the return of malloc properly. Thanks again! – Juan Bonnett Nov 29 '15 at 07:15
  • @JuanBonnett please note that `malloc` gives you one giant contiguous block of memory: even if you have 2-3+ GB of free memory, there might not be (let's say) 1GB available as a single free block. Also I think I read somewhere that because of this, Windows has some "block limitations" so applications will not try to allocate memory from other apps, or something like that. – TomTsagk Nov 29 '15 at 07:48