
I am trying to push_back instances of an object into a vector (e.g. `A a; vectorA.push_back(a);`), but the code crashes whenever I push_back once the size of the vector reaches 16777216 (16*1024*1024). The capacity of the vector is 16777216 as well, which means the vector has to reallocate its memory.

Does anyone know how I should deal with this issue? I checked, and the memory used is about 320 MB.

harper
user1558064
    What is the result of vector::max_size? http://www.cplusplus.com/reference/vector/vector/max_size/ – Neil Kirk Jul 25 '13 at 15:56
  • Could it be that you are hitting an internal limit of the vector object? 16777216 = 2^24 – Marwie Jul 25 '13 at 15:57
  • 5
    If it's not exceeding the max_size, it's possible you don't have enough contigious free memory to perform the reallocation. – Neil Kirk Jul 25 '13 at 15:57
  • What is the need of max_size when it can fail even before that? – Saksham Jul 25 '13 at 15:58
  • 1
    Because it may not fail before that. There is more than one possible cause of failure here. But max_size is the upper limit at which it will definately fail. – Neil Kirk Jul 25 '13 at 16:00
  • 2
    Is the program just crashing, or is there an exception not being caught? Specifically, have you tried catching a `std::bad_alloc`? – zindorsky Jul 25 '13 at 16:02
  • max_size is way more than 16777216, thanks. Is there any way to fix the problem if that is the case: not enough contiguous free memory to perform the reallocation? – user1558064 Jul 25 '13 at 16:02
  • You could try another data structure such as deque, which does not require the entire data to be contiguous. You could also try reserving the size of the vector to the maximum if you know it. This could work because, when you push back, both the new and old vector arrays need to be kept in memory for a time. This is avoided when you reserve on a fresh vector. – Neil Kirk Jul 25 '13 at 16:08
  • Please clarify what is meant by crashing. Do you get a specific error code or message? – n. m. could be an AI Jul 25 '13 at 16:24

2 Answers

Since std::vector is a wrapper around a basic C array, it has the same limitations as a normal array, which are described here: Is there a max array length limit in C++? I basically agree with the comments above that this is a contiguous-memory limit. To work around it, you can switch from std::vector to some other container that does not use a single array (like map, list, or deque, depending on your needs). Another solution would be to use several vectors.

Bogolt

Try calling reserve() before pushing all the data into the vector. This pre-allocates the memory, avoiding many of the reallocations. If you know you will have more than 16777216 elements, call reserve(n) where n is a number greater than 16777216 and close to your final size, if you know it.

cdmh