
I have sometimes seen a segmentation fault during the initialization of an array with a huge size.

For example:

#include <iostream>
#include <string>
using namespace std;

int main()
{
    string h;
    cin >> h;
    int size = h.size();
    cout << size << endl;
    int arr[size][size];        // runtime-sized array on the stack
    cout << arr[0][0] << endl;  // reads an uninitialized element
    arr[0][0] = 1;
    cout << arr[0][0] << endl;

    return 0;
}

When the user input is a small string, let's say "sample", the program works fine.

When the user input is a big string, where the size is, for example, greater than 1500, a segmentation fault occurs at the declaration of the array int arr[size][size];

What can be the issue? Is there any problem with declaring an array like the one above?

starkk92

3 Answers


I think you are running out of memory with that declaration, causing a stack overflow. I recommend allocating the array on the heap or using a std::vector. See here: Segmentation fault on large array sizes
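
Something like this minimal sketch, which puts a single contiguous block on the heap instead of the stack (just an illustration reusing the question's variable names, not code from the linked post):

#include <iostream>
#include <string>

int main()
{
    std::string h;
    std::cin >> h;
    const auto size = h.size();

    // size*size ints on the heap instead of the stack, value-initialized to 0
    int* arr = new int[size * size]();

    arr[0 * size + 0] = 1;                  // element (row, col) lives at arr[row * size + col]
    std::cout << arr[0 * size + 0] << std::endl;

    delete[] arr;                           // release the heap block
    return 0;
}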

Andrea

I think an array's size must always be a compile-time constant in standard C++, i.e. the value of your 'size' variable must be known at compile time.

If you want storage whose size is decided at run time, use std::vector.
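
For example, a minimal sketch sized at run time with std::vector (only an illustration, reusing the names from the question and assuming the input string is not empty):

#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::string h;
    std::cin >> h;
    const auto size = h.size();

    // size x size grid allocated on the heap, zero-initialized
    std::vector<std::vector<int>> arr(size, std::vector<int>(size, 0));

    arr[0][0] = 1;
    std::cout << arr[0][0] << std::endl;
    return 0;
}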

Westbrook

MSDN states that the default stack size on Windows is 1 MB. With 1500 elements in each dimension, your array would take up 1500 * 1500 * 4 bytes = 9,000,000 bytes, roughly 8.58 MB. I'm not sure about Linux (this states it to be 8 MB) - I guess it depends on the compiler and distribution. So either:

1) If you know that there is a limit on the string length, increase the stack size accordingly with the /STACK linker flag on Windows, or as posted in this answer on Linux.

2) Allocate the array on the heap - if you don't want to mess around with manual memory allocation, std::vector or std::unique_ptr can be used as a container (see the sketch after this list).
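
A minimal sketch of the std::unique_ptr variant (only an illustration reusing the question's names; one flat heap allocation with manual row/column indexing):

#include <iostream>
#include <memory>
#include <string>

int main()
{
    std::string h;
    std::cin >> h;
    const auto size = h.size();

    // one flat heap block owned by the smart pointer, zero-initialized;
    // it is freed automatically when 'arr' goes out of scope
    std::unique_ptr<int[]> arr(new int[size * size]());

    arr[0 * size + 0] = 1;                  // element (row, col) is arr[row * size + col]
    std::cout << arr[0 * size + 0] << std::endl;
    return 0;
}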

Rudolfs Bundulis