
I am trying to initialise a vector to the maximum possible size at runtime. Initially, I thought (size_t)-1 gives me a theoretical maximum while vector::max_size() gives the true runtime maximum. After encountering an application failure and asking this question, I have realised that it is also just another theoretical limit. Further research turned up this:

(maximum size a vector can reach) limited by the largest contiguous chunk of RAM your OS can allocate for you or the return value of vector<>::max_size(), whichever is smaller.

In vectors (and probably containers in general), how do I find this true runtime maximum? I am currently still midway through building my application (it is a practice project), so I will accept any reasonably elegant and efficient solution that gives an exact value, or a close but safe approximation.

  • If you are using a pointer and the `new` keyword, then the vector is stored in dynamic memory (the heap), which is allocated during program execution. Also, C++ allows you to change the type of data structure; for example, you can create a linked list using memory blocks that are allocated on each element's creation, and the OS will find blocks of memory it can use until all RAM is toast – Jake Psimos Dec 12 '15 at 02:58

1 Answer


Running

#include <iostream>
#include <limits>
#include <vector>

int main()
{
    // Largest value representable by vector<int>'s size_type
    std::cout << std::numeric_limits<std::vector<int>::size_type>::max();
}

Here on Coliru we get

18446744073709551615

as the maximum number of elements the vector can hold. This should be the same on most, if not all, 64-bit systems. Even using the smallest data type (char), a vector of that size would need 18,446,744,073 GB of RAM, so in practical terms it is not achievable at present. The real limit on the size of the vector is therefore the largest contiguous piece of memory that you can grab. That limit depends on what else is running on the computer at the time you check, which also means any value you obtain may no longer be valid by the time you allocate, since another process could consume more memory in the meantime.
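If you want to see where the practical ceiling falls on your machine, one crude way is to probe: request a huge reservation and back off until the allocator cooperates. A minimal sketch of that idea (note that on systems with memory overcommit, such as Linux in its default configuration, a successful reserve does not guarantee the pages are actually usable):

#include <cstddef>
#include <iostream>
#include <new>
#include <vector>

int main()
{
    std::vector<char> v;
    std::size_t n = v.max_size();
    // Halve the request until the allocator accepts it. The result is
    // only a snapshot: another process can change what is available.
    while (n > 0)
    {
        try
        {
            v.reserve(n);
            break; // allocation succeeded
        }
        catch (const std::bad_alloc&)
        {
            n /= 2;
        }
    }
    std::cout << "Largest reservation that succeeded: " << n << " elements\n";
}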

If you want an upper limit on how big a vector you can have, you can find the available memory using one of the solutions from How to get available memory C++/g++? and divide it by the size of the type you are trying to store in the vector.
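For instance, on Linux with glibc you can get the count of free physical pages from sysconf. This is only a sketch of that approach, and it is platform-specific (on Windows you would use GlobalMemoryStatusEx instead, as covered in the linked question):

#include <cstddef>
#include <iostream>
#include <unistd.h> // POSIX; _SC_AVPHYS_PAGES is a glibc extension

int main()
{
    const long pages = sysconf(_SC_AVPHYS_PAGES); // currently free pages
    const long page_size = sysconf(_SC_PAGESIZE); // bytes per page
    const std::size_t avail =
        static_cast<std::size_t>(pages) * static_cast<std::size_t>(page_size);

    // Upper bound on element count: free bytes / bytes per element.
    std::cout << "At most about " << avail / sizeof(int)
              << " ints fit in currently free physical RAM\n";
}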

That should work for all of the standard contiguous containers, as they are guaranteed to store their contents in a contiguous chunk. For non-contiguous containers the maximum size depends on the implementation of the container, such as the size of the nodes in a list or map, or the size of the hash table in the unordered containers.
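You can see this effect by comparing max_size() across container types. The exact figures vary by implementation, but node-based containers typically report smaller limits because each element carries extra bookkeeping:

#include <iostream>
#include <list>
#include <map>
#include <vector>

int main()
{
    // max_size() accounts for per-element overhead, so node-based
    // containers report fewer elements than a plain vector does.
    std::cout << "vector<int>:  " << std::vector<int>{}.max_size() << '\n';
    std::cout << "list<int>:    " << std::list<int>{}.max_size() << '\n';
    std::cout << "map<int,int>: " << std::map<int, int>{}.max_size() << '\n';
}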

I do have to question why you want such a large container. Most things can generally be split into smaller chunks so that you do not have to deal with hitting the limit on what your system can allocate.

  • If I know exactly how many elements I want to initialise, and it will be very, very many, enough to be of concern to the OS, but the count will not change thereafter, should I use a vector or a deque? – thegreatjedi Dec 12 '15 at 04:50
  • @thegreatjedi I would suggest a vector unless you need to remove from the front in constant time – NathanOliver Dec 12 '15 at 04:59