5

This code produces a segmentation fault during the array declaration. I'm confused as to why this happens. I intentionally selected 2000000000 as a value because it is below 2^31 and can fit into an integer variable.

int main()
{

    int  nums_size = 2000000000;

    int nums[nums_size];

    int i;
    for(i = 0; i < nums_size; i++) {
        nums[i] = i;
    }


    return 0;

}
Artelius
aoeu
  • Possible duplicate of [Segmentation fault on large array sizes](https://stackoverflow.com/questions/1847789/segmentation-fault-on-large-array-sizes) – phuclv Sep 02 '19 at 01:27
  • [Explanation of why allocating a large array results in segmentation fault in C](https://stackoverflow.com/q/30864358/995714), [Why do I get a segfault in C from declaring a large array on the stack?](https://stackoverflow.com/q/3144135/995714) – phuclv Sep 02 '19 at 01:28

6 Answers

21

Well, for one thing, that's two billion integers. If you have a 32-bit address space and int has a size of four bytes on your platform (typical for a 32-bit platform), you can't store that many integers, period.

Even so, you only have so much space available to you on the stack, which is where automatic variables are located.

If you need a really large array, you should dynamically allocate it using malloc() (and if you do so, be sure to free it using free() when you are done with it!).
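A minimal sketch of that approach (the element count below is an arbitrary illustrative value, not the original two billion, which would not fit in a 32-bit address space anyway):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t nums_size = 100000000;                   /* illustrative size, not the OP's value */
    int *nums = malloc(nums_size * sizeof *nums);   /* heap allocation instead of the stack */
    if (nums == NULL) {
        fprintf(stderr, "allocation failed\n");
        return 1;
    }

    for (size_t i = 0; i < nums_size; i++) {
        nums[i] = (int)i;
    }

    free(nums);   /* release the memory when you are done with it */
    return 0;
}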

James McNellis
  • Even if that weren't the case, 2000000000*4 = 8,000,000,000 bytes on a 32bit platform. That's almost 2^33, which is more than available memory. – Chris K Jun 16 '10 at 00:10
  • @Chris: Yeah--I didn't actually count the zeros until after I posted. That's a lot of integers! – James McNellis Jun 16 '10 at 00:13
  • It's also typically possible to allocate very large arrays with static storage duration. – caf Jun 16 '10 at 01:40
  • @caf: True, though I'm always hesitant to suggest that--I've had to maintain and rework too much legacy code that wasn't designed to be reentrant but needed to be used in multithreaded software. :-P You are right, though: there are some circumstances under which a static array is the right solution. – James McNellis Jun 16 '10 at 02:07
  • Yes - all true, although I would suggest that allocating an 8GB working array in most cases tends to make a function non-reentrant for all practical purposes anyway ;) – caf Jun 16 '10 at 02:21
4
int  nums_size = 2000000000;

int nums[nums_size];

This does not mean 2000000000 bytes; it declares 2000000000 elements of type int, which with 4-byte ints comes to almost 8GB of memory, more than a 32-bit platform can even address.
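A quick sketch of the arithmetic (assuming the typical 4-byte int; the 64-bit type is only there so the product is not truncated on a 32-bit machine):

#include <stdio.h>

int main(void)
{
    unsigned long long nums_size = 2000000000ULL;
    /* total bytes = element count * sizeof(int) */
    unsigned long long bytes = nums_size * sizeof(int);
    printf("%llu bytes (about %.1f GB)\n", bytes, bytes / 1e9);
    return 0;
}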

Chris K
3

You are allocating a giant array on the stack. Virtually no C/C++ compiler will handle that correctly.

You might be able to get away with moving it to global scope (which reserves the space statically in the executable, so it is mapped when the program loads rather than carved out of the stack), or by switching to a malloc'd array.

Of course, that's still a LOT of memory to ask for at one go, but at least the methods I'm mentioning will avoid an immediate segfault.

Walter Mundt
    The compiler will handle it correctly (if it was inside the 2^32 memory size), but the operating system won't allow the stack to get that big. – Yann Ramin Jun 16 '10 at 00:51
  • not just C/C++: pretty much any language that does stack-based allocation (i.e. almost all of them) – Spudd86 Jun 16 '10 at 02:46
3

Local variables are allocated on the stack. There is a fixed amount of stack space (typically 1MB–8MB, depending on the OS) available to the application. The general rule is to use malloc() to allocate large amounts of data.
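On a POSIX system you can inspect that limit yourself; a rough sketch using getrlimit() (which is POSIX, not part of standard C):

#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;
    /* RLIMIT_STACK is the soft limit on the size of the process stack */
    if (getrlimit(RLIMIT_STACK, &rl) == 0) {
        printf("stack soft limit: %llu bytes\n",
               (unsigned long long)rl.rlim_cur);
    }
    return 0;
}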

5ound
1

The answer to your question is simple: stackoverflow. No, no, not the site, but the actual process of "overflowing the stack". You don't have enough stack to store that array. As simple as that. Doing this on memory-constrained systems is pure madness. Also see this question.

INS
0

This version runs fine on my PC:

const int nums_size = 2000000000;
int nums[nums_size];

int main()
{
    int i;
    for(i = 0; i < nums_size; i++) {
        nums[i] = i;
    }

    return 0;
}

(Well, let's be honest. It starts fine, but soon goes into swap.)

sigfpe
  • And I'm guessing your version runs on a 64bit platform. From his 2^31 comment, he's definitely not running a 64bit OS. – Chris K Jun 16 '10 at 01:05
  • @Chris the 2^31 comment doesn't tell you whether you're running 32-bit or 64-bit. I think gcc defaults to 32-bit ints on 64-bit platforms. – sigfpe Jun 16 '10 at 02:18