I'm using a server with 128 GB of memory to do some computation. I need to malloc()
a 2D float array of size 56120 * 56120. Example code is as follows:
#include <iostream>
#include <cstdlib>

using namespace std;

int main(int argc, char const *argv[])
{
    float *ls;
    int num = 56120, i, j;
    // request num * num floats from the heap
    ls = (float *)malloc((num * num) * sizeof(float));
    if (ls == NULL) {
        cout << "malloc failed !!!" << endl;
        while (1);  // loop forever so the process can be inspected
    }
    cout << "malloc succeeded ~~~" << endl;
    return 0;
}
The code compiles successfully, but when I run it, it prints "malloc failed !!!". As I calculated, it should only take about 11 GB of memory to hold the whole array. Before starting the program, I checked the server and there was 110 GB of free memory available. Why does the allocation fail?
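For reference, this is how I arrived at the ~11 GB figure (a minimal standalone sketch, assuming 4-byte floats and doing the multiplication in 64-bit arithmetic):

#include <iostream>
#include <cstddef>

using namespace std;

int main()
{
    // cast first so the multiplication is done in size_t, not int
    size_t bytes = (size_t)56120 * 56120 * sizeof(float);
    cout << bytes << " bytes (about "
         << bytes / (1024.0 * 1024.0 * 1024.0) << " GiB)" << endl;  // ~11.73 GiB
    return 0;
}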
I also found that if I reduce num to, say, 40000, then the malloc() succeeds. Does this mean that there is a limit on the maximum amount of memory that malloc() can allocate?
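To sanity-check the numbers involved, here is a small sketch that prints the required element count next to INT_MAX (my own diagnostic, not part of the original program):

#include <iostream>
#include <climits>

using namespace std;

int main()
{
    // element count computed in 64-bit arithmetic
    long long elements = (long long)56120 * 56120;
    cout << "elements needed: " << elements << endl;  // 3149454400
    cout << "INT_MAX:         " << INT_MAX << endl;   // 2147483647 with 32-bit int
    return 0;
}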
Moreover, if I change the way of allocation and instead directly declare a 2D float array of that size, as follows:
#include <iostream>

using namespace std;

int main(int argc, char const *argv[])
{
    int num = 56120, i, j;
    float ls[3149454400];  // 56120 * 56120 elements, declared directly
    if (ls == NULL) {
        cout << "malloc failed !!!" << endl;
        while (1);
    }
    cout << "malloc succeeded ~~~" << endl;
    // write to the last 10 x 10 corner of the array
    for (i = num - 10; i < num; i++) {
        for (j = num - 10; j < num; j++) {
            ls[i * num + j] = 1;
        }
    }
    // read back the last 11 x 11 corner
    for (i = num - 11; i < num; i++) {
        for (j = num - 11; j < num; j++) {
            cout << ls[i * num + j] << endl;
        }
    }
    return 0;
}
then when I compile and run it, I get a "Segmentation fault".
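In case it is relevant: in this second version ls is a local variable, so here is a sketch (Linux/POSIX, using getrlimit) of how the stack size limit of the process could be checked:

#include <iostream>
#include <sys/resource.h>

using namespace std;

int main()
{
    struct rlimit rl;
    // query the soft limit on this process's stack size
    if (getrlimit(RLIMIT_STACK, &rl) == 0) {
        if (rl.rlim_cur == RLIM_INFINITY)
            cout << "stack limit: unlimited" << endl;
        else
            cout << "stack limit: " << (unsigned long long)rl.rlim_cur << " bytes" << endl;
    }
    return 0;
}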
How can I solve this?