0
void initialize(int arr[], int size[], int n)
{
    int i;
    for(i = 1; i <= n; i++) {
        arr[i] = i;
        size[i] = 1;
    }

}

class hell
{
    public:
    int edges;
    int vertices;
    pair<int , pair<int,int>> p[100000];
    int disjoint_set[10000];
    int cc_size[10000]; // size of connected components

    hell(int e, int v)
    {
      edges = e;
      vertices = v;

      initialize(disjoint_set, cc_size, vertices);
    }
};

In the above class, when I create an object with vertices = 100000 and edges = 100000, the code stops working. But when I remove the call to initialize(disjoint_set, cc_size, vertices), it starts working. I have no clue what causes this behavior. Please guide me.

Arijit Das
    What do you mean it "stops working"? – Carcigenicate Apr 24 '18 at 13:57
  • 4
    100000 is far bigger than 10000. – tkausl Apr 24 '18 at 13:57
  • Are there any errors and how long have you tried to wait for it? There are a lot of operations so it can easily take ~30sec+ (depending on hardware/...) – LenglBoy Apr 24 '18 at 13:59
  • 4
    Array indices start at 0 and not 1, you have UB, change your loop bounds. Furthermore, if you create a local `hell` variable, you are going to have huge arrays on the stack, and this may not be a good idea (especially the 100k array of pair of pair...). – Holt Apr 24 '18 at 14:00
  • 1
    Does it just hang? Throw an error? What about if you give it smaller values for `verticies` and `edges`? – Engineero Apr 24 '18 at 14:00
  • 1
    'hell' is a pretty descriptive class name in this case. – bipll Apr 24 '18 at 14:01
  • 2
    Why not simply use a vector and allocate the size you actually needs instead of having a huge underlying array that you partially use? – Holt Apr 24 '18 at 14:04
  • Possible duplicate of [Why does a large local array crash my program?](https://stackoverflow.com/questions/22945647/why-does-a-large-local-array-crash-my-program) **edit** [Segmentation fault on large array sizes](https://stackoverflow.com/questions/1847789/segmentation-fault-on-large-array-sizes) is probably the more canonical dupe – underscore_d Apr 24 '18 at 14:07

1 Answer

3

Arrays in C++ are zero-indexed, which means the valid indices are in the range [0, n). Your code gets this wrong:

for(i = 1; i <= n; i++) {
    arr[i] = i;
    size[i] = 1;
}

It should be:

for(i = 0; i < n; i++) {
    arr[i] = i + 1;
    size[i] = 1;
}

Or better, use the standard algorithms std::iota() (from <numeric>) and std::fill() (from <algorithm>):

std::iota( arr, arr + n, 1 );
std::fill( size, size + n, 1 );

Better still, use std::vector, which stores its elements on the heap and can be sized to exactly what you need, rather than keeping huge fixed-size arrays in the object.

Slava
  • 2
    You don't want `std::iota` for `size`, you want `std::fill`. – Holt Apr 24 '18 at 14:03
  • @Holt thanks, did not see that there is `1` not `i`, they are close. – Slava Apr 24 '18 at 14:26
  • Thanks a lot!!! I have one more question: when I initialized vertices = 10000 and edges = 100000, the code works but assigns hell.edges = 1, which seems weird. Is there any reason behind it? – Arijit Das Apr 25 '18 at 01:51