I have a question regarding vectors in C++. I know that, unlike arrays, there is no fixed limit on vectors. I have a graph with 6 million vertices and I am using a vector of a class. When I try to insert nodes into the vector it fails with a bad memory allocation error, whereas it works perfectly up to about 2 million nodes. I know a bad allocation usually means it is failing due to pointers I am using in my code, but that does not seem to be the case here. My question: is it possible that it is failing because of the large size of the graph, i.e. some limit on the vector has been exceeded? If so, is there any way to increase that limit?
-
Use a 64-bit system with a 64-bit compiler and your limits will practically disappear. Of course, if you try to use more memory than what is installed in your system, things will slow down dramatically. – Mark Ransom Aug 31 '15 at 17:07
-
Could you show us the code that you're using to add items to the vector, and especially the part where the exception gets thrown? – Xirema Aug 31 '15 at 17:08
-
Check `std::vector::max_size()` – Alex Díaz Aug 31 '15 at 17:10
-
When a *vector* needs to allocate more memory it doubles its size. This means that at some point it consumes 3 times as much memory as it needs to store its data. – Galik Aug 31 '15 at 17:15
-
@Galik, there are limits to this behavior. It is not simple doubling; it is a function of the current size. – SergeyA Aug 31 '15 at 17:16
-
@Galik Usually 1.5 is chosen rather than 2: http://stackoverflow.com/a/5232342/212870 – Alan Stokes Aug 31 '15 at 17:17
-
@Galik the actual growth in size depends on the library implementation. I remember seeing a study that found doubling was not the optimal strategy, although you can still find it in places. – Mark Ransom Aug 31 '15 at 17:17
-
@SergeyA Ah, good to know. I should have said *may consume*, because it is implementation defined. But you can't escape it consuming a minimum of *twice* as much memory as it needs while it expands. – Galik Aug 31 '15 at 17:19
-
@user3022426, "I know unlike array there is no limit on vectors" – since a vector is a contiguous sequence of elements, and it is hard to allocate a large amount of memory contiguously, I would say that your knowledge is incorrect. – StahlRat Aug 31 '15 at 17:20
-
@AlanStokes here's one explanation of why 1.5x is better than 2x: http://stackoverflow.com/a/1100426/5987 – Mark Ransom Aug 31 '15 at 17:22
-
Can you show us what your Node class looks like? If it also has std::vector members, then you need to know that there is significant overhead associated with std::vector (it needs to keep several pointers and performs dynamic memory allocation). – BrunoLevy Aug 31 '15 at 19:11
2 Answers
First of all, you should verify how much memory a single element requires. What is the size of one vertex/node? (You can check that with the sizeof operator.) If the answer is, say, 50 bytes, you need 50 bytes × 6 million vertices = 300 MB.
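For example, a minimal check (the `Vertex` layout below is hypothetical; substitute your own node class):

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Hypothetical vertex layout -- substitute your own class here.
struct Vertex {
    std::int32_t id;
    double x, y;
    std::vector<int> neighbours; // each vertex also owns a heap allocation
};

int main() {
    std::cout << "sizeof(Vertex) = " << sizeof(Vertex) << " bytes\n";
    std::cout << "6 million vertices need at least "
              << (sizeof(Vertex) * 6'000'000ULL) / (1024.0 * 1024.0)
              << " MB, excluding per-vertex heap allocations\n";
}
```

Note that sizeof only reports the fixed footprint of the class; any std::vector or string members allocate additional memory per element on the heap.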
Then consider the next problem: in a vector the memory must be contiguous. This means your program will ask the OS for a single contiguous chunk of 300 MB, and there is no guarantee such a chunk is available even if the total free memory exceeds 300 MB. You might have to split your data, or choose a non-contiguous container. Address-space fragmentation is impossible to control, which means your program may work on one run and fail on another with the same data (or vice versa).
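As a rough sketch of the non-contiguous alternative, std::deque stores its elements in fixed-size blocks and so never needs one giant allocation (`Vertex` here is a simplified stand-in for your node class):

```cpp
#include <cstddef>
#include <deque>

struct Vertex { int id; double x, y; }; // simplified placeholder type

int main() {
    // std::deque keeps elements in fixed-size blocks, so it never asks
    // the OS for one giant contiguous region the way std::vector does.
    std::deque<Vertex> graph;
    for (std::size_t i = 0; i < 6'000'000; ++i)
        graph.push_back(Vertex{static_cast<int>(i), 0.0, 0.0});
}
```

The trade-off is slightly slower element access and no contiguous buffer to pass to APIs that expect one.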
Another possible approach is to size the vector manually, instead of letting it choose its new capacity automatically. The vector tries to anticipate future growth, so when it has to grow it allocates more capacity than is strictly needed; this extra capacity might be the difference between having enough memory and not having it. You can use std::vector::reserve for this, though I think the exact behaviour is implementation dependent: it might still decide to reserve more than the amount you requested.
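A minimal sketch of that approach, again with a placeholder `Vertex` type:

```cpp
#include <vector>

struct Vertex { int id; double x, y; }; // simplified placeholder type

int main() {
    std::vector<Vertex> graph;
    graph.reserve(6'000'000); // one up-front allocation sized for the final
                              // count, avoiding repeated grow-and-copy steps
    for (int i = 0; i < 6'000'000; ++i)
        graph.push_back(Vertex{i, 0.0, 0.0});
}
```

Reserving up front also avoids the transient "old buffer plus new buffer" peak discussed in the comments, since no reallocation happens during the fill loop.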
One more option is to optimize the data types you are using. For example, if inside your vertex class you are using 32-bit integers where you only need 16 bits, you could use int16_t, which takes half the space. See the full list of fixed-width integer types at CPP Reference.
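As an illustration, the two hypothetical layouts below differ only in the width of two fields, and the narrower one is typically a third smaller:

```cpp
#include <cstdint>
#include <iostream>

struct WideVertex {                  // 32-bit fields everywhere
    std::int32_t id, degree, colour;
};

struct NarrowVertex {                // 16-bit fields where the range allows
    std::int32_t id;                 // ids up to 6 million still need 32 bits
    std::int16_t degree, colour;
};

int main() {
    std::cout << sizeof(WideVertex) << " vs " << sizeof(NarrowVertex) << '\n';
    // Typically prints "12 vs 8"; exact sizes depend on alignment rules.
}
```

Multiplied by 6 million elements, even a few bytes per vertex adds up to tens of megabytes.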

There is std::vector::max_size, which you can use to see the maximum number of elements the vector you declared can potentially hold. From the documentation:
Return maximum size
Returns the maximum number of elements that the vector can hold.
This is the maximum potential size the container can reach due to known system or library implementation limitations, but the container is by no means guaranteed to be able to reach that size: it can still fail to allocate storage at any point before that size is reached.
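For example, a quick way to print that limit (`Vertex` is a placeholder element type):

```cpp
#include <iostream>
#include <vector>

struct Vertex { int id; double x, y; }; // placeholder element type

int main() {
    std::vector<Vertex> graph;
    std::cout << "max_size: " << graph.max_size() << '\n';
    // This is a theoretical upper bound only: allocation can still fail
    // far below it if a large enough contiguous block is unavailable.
}
```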
