
I have the following piece of code:

#include <array>
#include <cstdlib>
#include <iostream>

int function()
{
  const int N=10000000;
  std::array<double, N> array{0.0};  // N doubles with automatic storage duration (on the stack)
  std::cout<<"N="<<N<<std::endl;
  return 0;
}

int main(int, char **)
{
  function();
  exit(0);
}

When I launch the program, I see:

Segmentation fault (core dumped)

The program only works for N<10000000. I understand that the reason is a stack overflow. But if I declare the array static:

static std::array<double, N> array{0.0};

everything works well up to N=1000000000. I was surprised.

As far as I understand, a static std::array or std::vector inside a function is allocated in global memory (as if it were a static global array), not on the stack. That is why I can declare a static array inside a function that is much bigger than an ordinary array local to the function. Is that true?


1 Answer


In a word, yes. :)
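
For illustration, here is the static variant from the question written out as a complete program (a sketch; the printed element is only there to show the array is usable):

#include <array>
#include <iostream>

int function()
{
  const int N = 10000000;
  // With static storage duration the array lives in the program's
  // static data area, not on the stack, so this size is fine.
  static std::array<double, N> array{0.0};
  std::cout << "N=" << N << " first=" << array[0] << std::endl;
  return 0;
}

int main(int, char **)
{
  return function();
}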

For completeness, it is worth mentioning that you can also allocate the array on the heap using the new operator.
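
A minimal sketch of that approach (the variable name heap_array is just for illustration):

#include <array>
#include <iostream>

int main(int, char **)
{
  const int N = 10000000;
  // The std::array object itself is created on the heap; only the
  // pointer to it lives on the stack.
  auto *heap_array = new std::array<double, N>{};  // value-initialized to all zeros
  std::cout << "N=" << N << " first=" << (*heap_array)[0] << std::endl;
  delete heap_array;
  return 0;
}

In modern C++ you would typically wrap the allocation in a std::unique_ptr (e.g. std::make_unique<std::array<double, N>>()) so that the delete happens automatically.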

You could also choose to store the data in a std::vector. The performance/flexibility tradeoffs there are a little different from std::array, but the issue of stack size doesn't arise even for vectors that live on the stack (since the underlying storage is dynamically allocated).
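
A sketch of the std::vector version, using the same N as in the question:

#include <iostream>
#include <vector>

int main(int, char **)
{
  const int N = 10000000;
  // The vector object on the stack is small; the N doubles it
  // manages are allocated on the heap.
  std::vector<double> vec(N, 0.0);
  std::cout << "N=" << N << " first=" << vec[0] << std::endl;
  return 0;
}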

– NPE
  • I checked using a local std::vector vec(N, 0.0) (without static) instead of std::array. It works, but at N=10^9 my computer froze, while with std::array it worked normally. – And Jul 17 '19 at 10:14
  • If I write std::array<double, N> array; without initializing the array with 0.0, the program runs without a segmentation fault even at N>10^7. Why? – And Jul 17 '19 at 10:17