I have a puzzle about the following code:

#include <iostream>
#include <vector>

int main() {
    const size_t LARGE = 12000000000;
    std::vector<long> vec(LARGE);
    vec[0] = -1;
    vec[LARGE - 1] = 1;
    std::cout << vec.size() << std::endl;
    return 0;
}

The output is:

12000000000

My RAM is 16 GB, so why can I allocate a vector that needs 12G × 8 bytes = 96 GB? I thought a vector has to be allocated as contiguous memory on the heap.

If the allocation happens in virtual memory: my virtual-memory setting on Windows 11 is:

Total paging file size for all drives: 9216 MB

So the total usable memory is 16 GB + 9 GB = 25 GB, which is far less than 12G × 8 bytes = 96 GB.

When I change LARGE to 16 billion (16G × 8 bytes = 128 GB), the program throws a bad_alloc error. What determines the usable memory on my PC?
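
One way to see what actually bounds allocations on Windows is to query the system commit limit (roughly physical RAM plus the page file) via `GlobalMemoryStatusEx`. The following is a minimal illustrative sketch, assuming a Windows build environment; it is not part of the original program:

#include <windows.h>
#include <iostream>

int main() {
    MEMORYSTATUSEX status{};
    status.dwLength = sizeof(status);
    if (GlobalMemoryStatusEx(&status)) {
        const unsigned long long mib = 1024ull * 1024ull;
        std::cout << "Physical RAM:     " << status.ullTotalPhys / mib << " MiB\n";
        // ullTotalPageFile is the current commit limit (RAM + page file), not just the page file.
        std::cout << "Commit limit:     " << status.ullTotalPageFile / mib << " MiB\n";
        std::cout << "Commit available: " << status.ullAvailPageFile / mib << " MiB\n";
    }
    return 0;
}
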

    Because you have a lot of virtual memory, and it is allocating memory dynamically. – Pepijn Kramer Aug 22 '23 at 11:52
  • But then why does 16G × 8 fail? – CPW Aug 22 '23 at 11:53
  • Side notes: use `static constexpr std::size_t LARGE = 12000000000;`. Containers use `std::size_t` for their sizes. And stop using `using namespace std;`. – Pepijn Kramer Aug 22 '23 at 11:53
  • What if you use `std::vector vec(LARGE, 1);`? – Daniel Langr Aug 22 '23 at 11:59
  • 16x8G is over 128GB, which likely exceeds your operating system's virtual addressing capabilities. – Sam Varshavchik Aug 22 '23 at 12:00
  • The 16G limit is probably coming from your OS, not from C++. – Mark Ransom Aug 22 '23 at 12:00
  • If the allocation is in virtual memory, and the allocation exceeds the amount that virtual memory can provide, the program could throw bad_alloc. The reality is even worse than that though, because operating systems often *over promise* memory, which may cause a fault when trying to access memory that was allocated but that the OS cannot actually provide. Not C++'s blame; blame the OS. – Eljay Aug 22 '23 at 12:01
  • My virtual-memory setting is "Total paging file size for all drives: 9216 MB" on Windows 11. So the total usable memory is 16 GB + 9 GB = 25 GB, which is far less than 12G × 8 bytes = 96 GB. – CPW Aug 22 '23 at 12:04
  • If you're using virtual memory at all, even the physical memory needs paging storage behind it. You can't add the physical and the paging; your limit will be solely the paging. – Mark Ransom Aug 22 '23 at 12:10
  • _"My RAM is 16G, but why I can allocate a vector of size 12x8G?"_ Because the program can use something like `calloc` and then just set those 2 values, which causes only 2 pages from the virtual address space to be mapped to RAM. `calloc` does not necessarily need to map all allocated pages by itself. See this excellent explanation: [Why malloc+memset is slower than calloc?](https://stackoverflow.com/q/2688466/580083) That was why I asked about an experiment with setting all elements to 1 instead of 0. – Daniel Langr Aug 22 '23 at 12:19
  • TL;DR: the memory is not actually allocated. You can verify this with `memset(vec.data(), 0xff, vec.size() * sizeof(vec[0]));`, which will fail because you don't have enough memory available (see the sketch after these comments). – Aykhan Hagverdili Aug 22 '23 at 12:24
  • @AykhanHagverdili This is not a duplicate of those 2 questions. OP is on Windows, while those questions are both related to Linux. – Daniel Langr Aug 22 '23 at 12:28
  • @DanielLangr https://stackoverflow.com/q/11779042 This one is not Linux-specific; the top answer simply uses Linux as an example of the kind of OS that supports that. So it very much applies to Windows. – Aykhan Hagverdili Aug 22 '23 at 12:34
  • @AykhanHagverdili But it is related only to the second part of the question. The primary question was why the 12G case allocates without any problem. – Daniel Langr Aug 22 '23 at 12:37
  • @DanielLangr The top answer there explains that it's because of opportunistic memory allocation. Does that not answer why the allocation succeeds? – Aykhan Hagverdili Aug 22 '23 at 12:42
  • @AykhanHagverdili I don't think so. This is a different case, since `vector` is required to set the values of all elements in OP's case in the constructor. So it cannot be translated just to a `malloc` call. – Daniel Langr Aug 22 '23 at 12:48
  • @DanielLangr I assume there's something similar for allocating zeroed out memory which is running into the same opportunistic allocation issue here – Aykhan Hagverdili Aug 22 '23 at 14:10
  • @DanielLangr It's also possible that the allocation completely gets optimized away: https://godbolt.org/z/xz57ze6fh – Aykhan Hagverdili Aug 22 '23 at 14:19
  • A bug on the library provider's side. This also results from under-specification of the standard; IMHO a standard defect is the root cause. Defining a safer allocator class and using it as the second template parameter of `std::vector` is not that hard anyway (see the allocator sketch after these comments). – Red.Wave Aug 22 '23 at 17:40
  • @AykhanHagverdili Yes, it's possible by the as-if rule. But it doesn't seem that this is the OP's case, where the large allocation throws `bad_alloc`. There are too many "possible" and "might" in this discussion. We don't even know which compiler the OP uses, so we can't try to reproduce this issue. – Daniel Langr Aug 23 '23 at 06:43
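
Following up on the comments above: a minimal sketch (not from the original post) of the experiment suggested by Daniel Langr and Aykhan Hagverdili. It forces every element to actually be written, so the OS must back the full ~96 GB with RAM or page-file space; if the original "success" came from lazily committed zero pages or from the allocation being optimized away, this version is expected to fail:

#include <cstddef>
#include <cstring>
#include <iostream>
#include <new>
#include <vector>

int main() {
    const std::size_t LARGE = 12000000000;      // 12e9 longs, about 96 GB
    try {
        std::vector<long> vec(LARGE);           // may appear to succeed without touching pages
        std::cout << "constructed, size = " << vec.size() << std::endl;
        // Writing a non-zero pattern touches every page and forces real commitment.
        std::memset(vec.data(), 0xff, vec.size() * sizeof(vec[0]));
        std::cout << "all pages written, back() = " << vec.back() << std::endl;
    } catch (const std::bad_alloc&) {
        std::cout << "bad_alloc" << std::endl;
    }
    return 0;
}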

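A rough sketch of Red.Wave's suggestion as well: a hypothetical "eager-touching" allocator (the name `touching_allocator` is made up here) passed as the second template parameter of `std::vector`. It writes to every allocated page immediately, so an allocation that the system cannot actually back fails up front rather than faulting later:

#include <cstddef>
#include <cstring>
#include <iostream>
#include <new>
#include <vector>

template <class T>
struct touching_allocator {
    using value_type = T;

    touching_allocator() = default;
    template <class U>
    touching_allocator(const touching_allocator<U>&) noexcept {}

    T* allocate(std::size_t n) {
        // ::operator new throws std::bad_alloc if the request cannot be satisfied.
        T* p = static_cast<T*>(::operator new(n * sizeof(T)));
        std::memset(p, 0, n * sizeof(T));       // touch every page right away
        return p;
    }
    void deallocate(T* p, std::size_t) noexcept { ::operator delete(p); }
};

template <class T, class U>
bool operator==(const touching_allocator<T>&, const touching_allocator<U>&) { return true; }
template <class T, class U>
bool operator!=(const touching_allocator<T>&, const touching_allocator<U>&) { return false; }

int main() {
    try {
        std::vector<long, touching_allocator<long>> vec(12000000000ull);
        std::cout << "allocated and touched " << vec.size() << " elements" << std::endl;
    } catch (const std::bad_alloc&) {
        std::cout << "bad_alloc" << std::endl;
    }
    return 0;
}
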
0 Answers