I am trying to port some old R code I had to C++ in order to gain some speed. This code requires preloaded data: 6 matrices of dimension 13689 x 126, each weighing about 28 MB as a .txt file. Having all of this loaded in memory was never a problem in R.
I wrote some C++ (I am new to C++) where I try to 'pre-load' these data (is there any way to pre-load them in a header file, by the way?).
#include <array>
#include <fstream>
using namespace std;

const int length_grid1 = 13689;
const int length_grid_pl = 126;

int i, j;
ifstream in;
//double M1_BETA[length_grid1][length_grid_pl]; // either this, or the std::array below
std::array<std::array<double, length_grid_pl>, length_grid1> M1_BETA;

// read the matrix cell by cell, row by row
in.open("preloaded_object/M1_BETA.txt");
for (i = 0; i < length_grid1; i++) {
    for (j = 0; j < length_grid_pl; j++) {
        in >> M1_BETA[i][j];
    }
}
in.close();
Neither version works (the plain double array or the std::array): the file compiles (using Intel Composer), but when I launch the .exe it instantly crashes. This happens with only one of the matrices loaded, and I need to load 6 of them.
I don't think the problem comes from my code: when I reduce the data dimensions, it runs without any problem.
Note that the rest of the program cannot run unless everything is loaded, and reloading the data piece by piece would, I think, slow things down too much (I switched to C++ precisely to gain computation speed).
Is the program crashing because 28 MB is too much to hold in memory? That seems odd, since R handles it without any problem.
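One thing I am wondering about (just a guess on my part, I have not tested it): would allocating the matrix on the heap with std::vector, roughly like the sketch below, behave differently from the fixed-size arrays above? The file path and dimensions are the same as in my snippet, everything else is just how I imagine it would look.

#include <fstream>
#include <vector>

const int length_grid1 = 13689;
const int length_grid_pl = 126;

int main() {
    // heap-allocated matrix: a vector of rows, each row a vector of doubles
    std::vector<std::vector<double>> M1_BETA(
        length_grid1, std::vector<double>(length_grid_pl));

    std::ifstream in("preloaded_object/M1_BETA.txt");
    for (int i = 0; i < length_grid1; i++) {
        for (int j = 0; j < length_grid_pl; j++) {
            in >> M1_BETA[i][j];   // same element order as my original loop
        }
    }
    // M1_BETA[i] now gives access to row i
    return 0;
}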
Otherwise, can I preload the data in a way that takes less space in memory? I need easy access to any row of the data afterwards (cell-by-cell access is not so important; what I really need is to reach specific rows quickly).
Or is there another way in C++ to store the data so that it is usable at any time?
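To make the row-access requirement concrete, here is a rough sketch of the kind of interface I have in mind: one flat, row-major buffer plus a helper that returns a pointer to a row. The names (Matrix, load, row) are made up, and I don't know whether this is the idiomatic way to do it.

#include <fstream>
#include <vector>

// one contiguous, row-major buffer instead of a 2-D array
struct Matrix {
    int rows, cols;
    std::vector<double> data;

    Matrix(int r, int c) : rows(r), cols(c), data(r * c) {}

    // pointer to the start of row i (cols consecutive doubles)
    const double* row(int i) const { return &data[i * cols]; }
};

// fill a Matrix from a whitespace-separated text file
bool load(Matrix& m, const char* path) {
    std::ifstream in(path);
    for (double& x : m.data)
        if (!(in >> x)) return false;   // stop if the file is short or malformed
    return true;
}

Then something like Matrix m1(13689, 126); load(m1, "preloaded_object/M1_BETA.txt"); const double* r = m1.row(42); is what I mean by getting at a specific row quickly.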
Thanks in advance.