8

When I run this code in my Dev-C++ compiler:

#include<bits/stdc++.h>
using namespace std;
int main()
{
    vector<int> vec;
    for(int i=0;i<100000000;i++)
    vec.push_back(i);
}

It works fine, even at run time. But when I run:

#include<bits/stdc++.h>
using namespace std;
int arr[1000000000];
int main()
{
    return 0;
}

It gives me a link error.

As far as required space goes, both arr and vec need the same amount of memory. Then why is it that the vec code runs fine at run time, but the arr code doesn't even compile?

user3522401

2 Answers

10

The issue is with the allocation. In the first case, std::vector's default allocator uses dynamic allocation, which in principle can allocate as much memory as you want (bounded, of course, by the OS and the amount of physical memory). In the second case, the array uses the memory available for static allocation (technically, the array has static storage duration), which in your case is smaller than 1000000000 * sizeof(int) bytes. See this for a nice answer regarding the various types of allocation in a C program (which also applies to C++).
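
To illustrate (this is just a sketch; the element count and the 4-byte int are assumptions for the example, not actual limits), the dynamic route looks like this:

#include <iostream>
#include <vector>

int main()
{
    // Dynamic (heap) allocation: memory is requested at run time, so the only
    // hard limit is what the OS is willing and able to hand out.
    std::vector<int> vec(100000000);   // ~400 MB assuming a 4-byte int

    // A manual heap allocation such as `new int[100000000]` behaves the same
    // way: if the request cannot be satisfied you get std::bad_alloc at run
    // time, not a link-time failure.
    std::cout << vec.size() << '\n';
}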

Btw, avoid #include <bits/stdc++.h>, as it is non-standard; include only the standard headers you need. One more issue: I don't think you get a compile-time error; you probably get a run-time error. In other words, the code compiles just fine but fails to run.
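
For the first snippet in the question, for instance, the non-standard header could be replaced by just the header it actually uses (again, only a sketch):

#include <vector>   // std::vector is all this snippet needs

int main()
{
    std::vector<int> vec;
    for (int i = 0; i < 100000000; ++i)
        vec.push_back(i);
}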

vsoftco
  • @StevenBurnap I am not sure OP is getting a compile error though, it is probably a run time error. Will edit the answer. – vsoftco Feb 01 '16 at 19:22
  • Yeah, what you say makes perfect sense if the OP got a runtime error. – Gort the Robot Feb 01 '16 at 19:22
  • `arr` is not defined on the stack. It is a global object. – R Sahu Feb 01 '16 at 19:22
  • @RSahu Missed that it was defined outside the function, will edit, thanks. – vsoftco Feb 01 '16 at 19:23
  • Oh yeah, if it's a global, there should be no limit as long as the index is an integral type. – Gort the Robot Feb 01 '16 at 19:32
  • @StevenBurnap Are you sure about that? Isn't the global data segment quite small? In any case, the standard does not mandate anything regarding where/how the allocations are being done. And what do you mean by *as long as the index is an integral type*? The size of the array has to be an integral type. – vsoftco Feb 01 '16 at 19:37
  • What I mean is, that the C++ standard only says that it has to be indexed by an integral type, and doesn't specify a maximum size. The global data segment depends entirely on the platform. As I noted elsewhere, this compiles and runs perfectly well on OSX using clang. – Gort the Robot Feb 01 '16 at 20:33
  • @StevenBurnap Ohh I see, yes I completely agree. – vsoftco Feb 01 '16 at 20:34
  • "you probably get a run-time error. In other words, the code compiles just fine, but fails to run." `arr` will be statically stored in the executable; this will therefore get (much) bigger but it won't overflow the stack. – edmz Feb 01 '16 at 21:25
  • Using a thousands separator, as in 1'000'000'000, would make the number more readable... – Peter VARGA Feb 01 '16 at 22:25
  • @Al Bundy In C++14 though – vsoftco Feb 02 '16 at 01:42
3

It seems that the object

int arr[1000000000];

is too large to fit in the global data of your program in your environment. I don't get a compile-time error, but I do get a link-time error in my environment as well (Cygwin/g++ 4.9.3).

Reducing the size by one tenth works for me. It may work for you as well. I don't know how you can determine the maximum size of an object that can fit in global data.

Space available on the stack is the smallest.
Space available in global data is larger than that.
Space available on the heap is the largest of all.

If your object is too large to fit on the stack, try putting it in global data.
If your object is too large to fit in global data, use the heap.
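
To make the three options concrete, here is a minimal sketch (the sizes are only illustrative, and std::make_unique<int[]> requires C++14):

#include <memory>
#include <vector>

int global_buf[1000];                    // global data: static storage duration

int main()
{
    int local_buf[1000];                 // stack: automatic storage, smallest budget
    (void)local_buf;                     // silence "unused variable" warnings

    // Heap: dynamic storage, the usual home for very large objects.
    std::vector<int> heap_vec(100000000);                  // managed by the vector
    auto heap_arr = std::make_unique<int[]>(100000000);    // smart-pointer array (C++14)
    (void)heap_arr;
}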

R Sahu
  • `int arr[1000000000];` is an object? really? – StahlRat Feb 01 '16 at 19:46
  • @StahlRat, In the C++ object model, every variable is an object. – R Sahu Feb 01 '16 at 19:54
  • Annex B of the C++ standard names 262144 as a guideline for maximum object size. If I understand correctly, this means that for a non-dynamic `int` array, anything up to `262144 / sizeof(int)` elements should be fine with a quality implementation. But of course, *"these quantities are only guidelines and do not determine compliance"*. – Christian Hackl Feb 01 '16 at 20:14
  • @StahlRat: object not in terms of classes, though. They're rather ADTs (Abstract Data Types). – edmz Feb 01 '16 at 20:39