
I have the following program:

#include <stdio.h>
#include <sys/resource.h>

int main()
{

    // Anything over ~8MB fails
    short int big[4000000];
    printf("%lu\n", sizeof(big));

}

ulimit shows that I have unlimited memory available to the program. However, if I try to allocate more memory I get an error:

short int big[6000000];
$ gcc main.c -o main.out && ./main.out
Segmentation fault: 11

Is there anything I need to change within the C program so that I can allocate, for example, a 1 GB array?

  • [Explanation of why allocating a large array results in segmentation fault in C](https://stackoverflow.com/q/30864358/995714), [Segmentation fault on large array sizes](https://stackoverflow.com/q/1847789/995714), [Why do I get a segfault in C from declaring a large array on the stack?](https://stackoverflow.com/q/3144135/995714), [why does this large array declaration produce a segmentation fault?](https://stackoverflow.com/q/3049934/995714) – phuclv Sep 02 '19 at 01:23

4 Answers


You're allocating the array on the stack (it has automatic storage duration). This means the compiler will emit code to reserve that space, and when your main() is called, the stack pointer is moved way past the mapped stack area available to your program. Touching the stack then causes the segmentation fault you see.

You could increase the stack size, but doing so is neither simple nor portable, and in general allocating such a large array on the stack is bad practice and should be avoided. To handle such a big array, you should allocate it dynamically, for example with malloc().
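
If you really do want to raise the stack limit, a rough, Linux-specific sketch using setrlimit() could look like the following. The 64 MiB figure and the helper function use_big_array() are arbitrary choices for illustration, and on some systems (e.g. macOS) the main thread's stack size is fixed at program start, so this may not help there. The malloc() approach below remains the recommended one.

#include <stdio.h>
#include <sys/resource.h>

#define BIG_COUNT 6000000

// The big array lives in this function's frame, which is only
// entered after the stack limit has been raised in main().
static void use_big_array(void)
{
    short int big[BIG_COUNT];

    // Touch the memory so the stack pages are actually mapped.
    for (size_t i = 0; i < BIG_COUNT; i++)
        big[i] = 0;

    printf("%zu bytes on the stack\n", sizeof(big));
}

int main(void)
{
    struct rlimit rl;

    if (getrlimit(RLIMIT_STACK, &rl) == 0) {
        rl.rlim_cur = 64UL * 1024 * 1024;   // ask for a 64 MiB soft limit
        if (rl.rlim_cur > rl.rlim_max)
            rl.rlim_cur = rl.rlim_max;      // cannot exceed the hard limit
        setrlimit(RLIMIT_STACK, &rl);
    }

    use_big_array();
    return 0;
}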

Here's a working example using malloc():

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    short int *big;

    big = malloc(6000000 * sizeof(short int));
    if (big == NULL) {
        fputs("Failed to allocate memory!\n", stderr);
        return 1;
    } 

    // Do whatever...

    free(big);
    return 0;
}

Also, remember that you cannot use sizeof() in this case since big is a dynamically allocated array (sizeof(big) would yield the size of the pointer, not the real size of the array). This is because sizeof() is a compile-time operator and can only help you if the size is known at compile time. In this case, it is not, since the space is allocated at runtime.

If you want to know the size of that array, you can simply calculate it with a multiplication:

short int *big;
const size_t big_size = 6000000ULL * sizeof(short int);

printf("Size: %zu\n", big_size);

big = malloc(big_size);
// ...
Marco Bonelli

You can't statically declare an array that large; it causes the program stack to overflow. What you need is to allocate the memory dynamically, and a linked list can serve your purpose here, as in the sketch below.
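
For illustration, here is a rough sketch of what a chunked linked list might look like. CHUNK_SIZE and struct node are made-up names, and the 6000000 element count just matches the question; this is one possible layout, not the only way to do it.

#include <stdio.h>
#include <stdlib.h>

// Each node holds a fixed-size chunk of elements, so the data
// never needs one huge contiguous block of memory.
#define CHUNK_SIZE 100000

struct node {
    short int data[CHUNK_SIZE];
    struct node *next;
};

int main(void)
{
    struct node *head = NULL;
    size_t total = 6000000, allocated = 0;

    // Build the list one chunk at a time.
    while (allocated < total) {
        struct node *n = malloc(sizeof *n);
        if (n == NULL) {
            fputs("allocation failed\n", stderr);
            return 1;
        }
        n->next = head;
        head = n;
        allocated += CHUNK_SIZE;
    }

    printf("allocated %zu elements in chunks\n", allocated);

    // Free the whole list.
    while (head != NULL) {
        struct node *next = head->next;
        free(head);
        head = next;
    }
    return 0;
}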

Shakibuz_Zaman
#include <stdio.h>
#include <stdlib.h>

int main()
{
    // allocate the memory you need
    short int* big = (short int*)malloc(6000000 * sizeof(short));

    if (big)
    {
        printf("alloc all good\n");

        // to free memory
        free(big);
    }
    else
    {
        printf("alloc failed\n");
    }
}
robthebloke

You should use dynamic memory allocation instead of statically defining an array when you need to allocate 1 GB of data. This link will help you learn the difference.
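
For instance, a sketch of such a 1 GB allocation might look like this; calloc() is used here because it also zero-initializes the memory and checks the count * size multiplication for overflow.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    // Roughly 1 GiB worth of short ints.
    size_t count = 1024UL * 1024 * 1024 / sizeof(short int);
    short int *big = calloc(count, sizeof(short int));

    if (big == NULL) {
        fputs("1 GB allocation failed\n", stderr);
        return 1;
    }

    printf("allocated %zu elements (%zu bytes)\n", count, count * sizeof(short int));
    free(big);
    return 0;
}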

shubham