
I have a program which needs to initialize an unordered_map<int,int> with 100M+ entries. Is it true that, as long as the machine has enough memory, we can declare a local variable as large as we want? Or is there some (tunable) upper bound on the size of a local variable even when the machine has huge memory, like 128GB? I know that unordered_map has large memory overhead.

I have this concern because I ran into such a problem with a Java program. I know that Java runs on the JVM, while C++ does not require one.

I'm developing the code on a Linux machine with 128GB memory, but the potential clients may use it on a Linux machine with 8GB memory.

The code looks like:

#include <climits>        // INT_MAX
#include <unordered_map>

int func() {
  std::unordered_map<int, int> mp;
  for (int i = 0; i < INT_MAX; i++) mp[i] = i - 1;
  return static_cast<int>(mp.size());  // size() returns size_t
}

This post discusses heap memory, but I'm still a bit confused here.

user3813057
  • What do you mean by “local variable”? The contents of the map aren’t stored on the stack or anything. 100 million `pair<int, int>` entries is also 800ish MB, so…. – Ry- Mar 10 '18 at 17:24
  • What's the map even used for? In your current example, the map is pretty pointless. – tkausl Mar 10 '18 at 17:31
  • @Ryan, 800MB would be for a vector. An unordered_map takes significantly more; typical overhead is about 32 bytes per element, so it would be more like 4GB. – Eelke Mar 10 '18 at 19:26
  • If in your actual use case you also have a contiguous series of integer keys, you will be way better off using your keys as indexes into a std::vector (see the sketch after these comments). – Eelke Mar 10 '18 at 19:29
  • @Eelke: 32 *bytes* per element? That’s surprising. Thanks for the correction. – Ry- Mar 10 '18 at 19:36
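
The arithmetic behind Eelke's estimate: 100M entries at 8 bytes of payload plus roughly 32 bytes of per-node overhead is about 4GB in an unordered_map, versus about 400MB in a flat array. A minimal sketch of the vector-as-map idea, assuming the keys are a contiguous run starting at 0 (the sizes are rough estimates):

#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
  const std::size_t n = 100000000;          // 100M entries
  std::vector<int> values(n);               // ~400MB: 4 bytes per int, no per-node overhead
  for (std::size_t i = 0; i < n; ++i)
    values[i] = static_cast<int>(i) - 1;    // key i maps to value i-1, as in the question
  std::printf("%zu entries\n", values.size());
}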

2 Answers


In my experience as a C++ programmer, you can't declare local arrays as big as you want: local variables live on the call stack, which is small (typically a few megabytes). But in your case, since you are using the STL, there should be no problem, because an STL container keeps only a few small members on the call stack; the elements themselves are stored on the heap.
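
A quick way to see this is that the map object itself is a fixed-size handle whose stack footprint never grows, no matter how many elements it holds. A minimal sketch (the exact sizeof is implementation-specific, typically a few dozen bytes on 64-bit platforms):

#include <cstdio>
#include <unordered_map>

int main() {
  std::unordered_map<int, int> mp;                     // only this handle lives on the stack
  std::printf("sizeof(mp) = %zu bytes\n", sizeof mp);

  for (int i = 0; i < 1000000; ++i) mp[i] = i - 1;     // nodes are allocated on the heap
  std::printf("%zu elements, sizeof(mp) is still %zu\n", mp.size(), sizeof mp);
}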

Eduardo Pascual Aseff
  • Can you please provide a link or reference showing that STL containers mainly use the heap? Moreover, if an STL container consumes heap memory, does that mean the container can claim as much space as it needs, as long as it fits in the available heap? – user3813057 Mar 11 '18 at 15:51
  • You may check Nicolai M. Josuttis's book "The C++ Standard Library", chapter 15, section 3. – Eduardo Pascual Aseff Mar 11 '18 at 16:19
  • I have the 2nd edition of the book; chapter 15, section 3 there is about I/O streams. Is that the edition you are using, or which edition are you referring to? Thanks. – user3813057 Mar 12 '18 at 17:22
  • I have the first edition; the chapter is about the default allocator (see the sketch below). – Eduardo Pascual Aseff Mar 14 '18 at 03:27
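
Since the default allocator is the relevant chapter: one way to verify for yourself that a container's storage comes from the heap is to plug in a minimal logging allocator that forwards to ::operator new. A sketch, assuming nothing beyond standard C++11 (LoggingAlloc is a made-up name; the container rebinds it internally for its node and bucket types):

#include <cstdio>
#include <unordered_map>

// Minimal C++11 allocator: forwards to the global heap via ::operator new/delete
// and prints each request, showing where the container's storage really comes from.
template <class T>
struct LoggingAlloc {
  using value_type = T;
  LoggingAlloc() = default;
  template <class U> LoggingAlloc(const LoggingAlloc<U>&) {}
  T* allocate(std::size_t n) {
    std::printf("heap allocation: %zu objects of %zu bytes\n", n, sizeof(T));
    return static_cast<T*>(::operator new(n * sizeof(T)));
  }
  void deallocate(T* p, std::size_t) { ::operator delete(p); }
};
template <class T, class U>
bool operator==(const LoggingAlloc<T>&, const LoggingAlloc<U>&) { return true; }
template <class T, class U>
bool operator!=(const LoggingAlloc<T>&, const LoggingAlloc<U>&) { return false; }

int main() {
  std::unordered_map<int, int, std::hash<int>, std::equal_to<int>,
                     LoggingAlloc<std::pair<const int, int>>> mp;
  for (int i = 0; i < 4; ++i) mp[i] = i - 1;  // every node and bucket array comes from the heap
}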

C++ can use as much memory as your OS is willing to provide to a single process. For a 32-bit process this is no more than 4GB; for a 64-bit process it is effectively the total of available memory and swap space on the machine (there are architectural limits, but I don't imagine current machines will run into them).
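
On Linux the per-process cap, where one is set, is tunable: it is the address-space limit that `ulimit -v` controls, and a program can query it with getrlimit. A minimal sketch, assuming a POSIX system:

#include <cstdio>
#include <sys/resource.h>

int main() {
  // RLIMIT_AS is the per-process virtual address-space cap ("ulimit -v");
  // RLIM_INFINITY means no limit is currently set.
  rlimit lim;
  if (getrlimit(RLIMIT_AS, &lim) == 0) {
    if (lim.rlim_cur == RLIM_INFINITY)
      std::printf("address space: unlimited\n");
    else
      std::printf("address space soft limit: %llu bytes\n",
                  (unsigned long long)lim.rlim_cur);
  }
}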

Alan Birtles