I'm new to C++. I'm studying compressive sensing, so I need to work with huge matrices, and since MATLAB was too slow I implemented my algorithm in C++.
The thing is that I store big arrays (around 100 MB to 1 GB each), roughly 20 of them. Everything works fine while the process uses up to 30 GB of memory, but as soon as it needs more than 40 GB it just stops. I think it's a memory problem. I tested it on both Linux and Windows (64-bit OS, 64-bit MinGW compiler, 200 GB RAM, Intel Xeon). Is there some limitation I'm running into?

Each array is allocated like this:
size_t tm = n * m * l;        // total number of elements
double *x = new double[tm];   // one big contiguous block
I use around 20 arrays like this one; n, m ≈ 1000 and l ≈ 30 are the typical sizes (so each array holds about 30,000,000 doubles, roughly 240 MB).
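In case it helps, here is a minimal, self-contained sketch of the allocation pattern (using the typical sizes above, and assuming n, m, l are plain ints as in my code; the cast to size_t is there so the multiplication is done in 64 bits, and the zeroing loop makes the OS actually commit the pages):

#include <cstddef>
#include <iostream>
#include <new>
#include <vector>

int main() {
    const int n = 1000, m = 1000, l = 30;   // typical sizes from above
    // cast before multiplying so the product doesn't overflow int
    const std::size_t tm = static_cast<std::size_t>(n) * m * l;

    std::vector<double*> arrays;
    try {
        for (int i = 0; i < 20; ++i) {
            double *x = new double[tm];     // ~240 MB each
            // touch the memory so the pages are really committed
            for (std::size_t j = 0; j < tm; ++j) x[j] = 0.0;
            arrays.push_back(x);
            std::cout << "allocated array " << i << "\n";
        }
    } catch (const std::bad_alloc&) {
        std::cerr << "allocation failed after " << arrays.size() << " arrays\n";
    }

    for (double *p : arrays) delete[] p;
    return 0;
}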
Thank you