I have a program which needs to initialize an unordered_map<int,int>
with 100M+ entries. Is it true that, as long as the machine has enough memory, we can declare a local variable as large as we want? Or is there some (tunable) upper bound on the size of a local variable even when the machine has huge memory, like 128GB? I know that unordered_map
has a large per-entry memory overhead.
I have this concern because I ran into such a problem when running a Java program. I know that Java has the JVM, while C++ does not require one.
I'm developing the code on a Linux machine with 128GB of memory, but potential clients may run it on a Linux machine with only 8GB.
The code looks like:

#include <climits>
#include <unordered_map>

int func() {
    std::unordered_map<int, int> mp;
    for (int i = 0; i < INT_MAX; i++) mp[i] = i - 1;
    return mp.size();
}
This post discusses heap memory, but I'm still a bit confused here.