
I'm using the following:

vector < vector < unsigned int > > paths;

But it seems I can only add 647,278 rows. I have 10 numbers in each row.

I call this on each iteration:

paths.resize(paths.size() + 1, vector < unsigned int >(10));

Is there a better way to do this than calling it on every iteration? And do I always have to give the column count, since it doesn't change?

sleepless_in_seattle

2 Answers


As pyCthon points out, this other question explains that size_t is the right type to use for the size here, as it is guaranteed to be large enough to represent the maximum container size on your architecture.

Secondly, you don't really need to call .resize() each time. Instead, construct the new row and .push_back(newvec) to append it. The vector's internal allocator grows the storage as it sees fit, and that is generally the best option here: capacity grows geometrically, so only O(log n) reallocations are needed in total. That matters, because every time the vector outgrows its block it has to copy the entire array into a new block of memory.
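For instance, a minimal sketch of that pattern (the iteration count and row contents here are placeholders, not taken from the question):

#include <cstddef>
#include <vector>

int main()
{
    std::vector< std::vector<unsigned int> > paths;

    for (std::size_t i = 0; i < 1000; ++i)        // placeholder iteration count
    {
        std::vector<unsigned int> newvec(10);     // one row of 10 numbers
        // ... fill newvec here ...
        paths.push_back(newvec);                  // the vector grows its own storage
    }
    return 0;
}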

Even better, if you can work out the total number of rows at the start, do so: call .reserve(size) once and then .push_back() each row. The outer vector then allocates its whole block at the beginning, and no reallocations happen during the loop.
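A minimal sketch of that approach, assuming the total row count is known up front (the figure below is just the one from the question):

#include <cstddef>
#include <vector>

int main()
{
    const std::size_t totalRows = 647278;         // assumed to be known in advance
    std::vector< std::vector<unsigned int> > paths;
    paths.reserve(totalRows);                     // one allocation for the outer vector

    for (std::size_t i = 0; i < totalRows; ++i)
    {
        paths.push_back(std::vector<unsigned int>(10));
    }
    return 0;
}

Note that .reserve() only pre-allocates the outer vector's block of row objects; each inner vector still makes its own small allocation for its 10 elements.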

If you want to know the maximum number of elements a vector can take on your architecture, call vector::max_size(). Example from cplusplus.com:

// querying the maximum number of elements a vector can hold
#include <iostream>
#include <vector>
using namespace std;

int main ()
{
  vector<int> myvector;
  cout << "max_size: " << myvector.max_size() << "\n";
  return 0;
}

Running this on ideone.com quickly gets me a max size of 1,073,741,823, and if the vector is a vector< vector< unsigned int > > instead, I get 357,913,941.

Phil H

I'm not sure, but this std::vector description points out that a vector's storage is one contiguous block of memory. That means the vector is stored in one big block in your memory.

Have you tried splitting this big array into several separate vectors?

Lukas
    Since this is vectors within a vector it will be one large block and lots of little ones. – Mark Ransom Nov 26 '12 at 22:40
  • I checked it: if he is sure the inner vector always has exactly 10 elements, it's better to use an array inside the outer vector, because each inner vector stores a lot of unneeded bookkeeping information. – Lukas Nov 26 '12 at 23:22
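
A minimal sketch of the fixed-size-row idea from the last comment (this assumes C++11's std::array; the names are illustrative, not from the question):

#include <array>
#include <vector>

int main()
{
    // Every row has exactly 10 elements, so std::array avoids the per-row
    // pointer/size/capacity bookkeeping that an inner std::vector carries.
    std::vector< std::array<unsigned int, 10> > paths;

    std::array<unsigned int, 10> row = {};        // zero-initialised row
    paths.push_back(row);
    return 0;
}

With std::array the 10 values also live directly inside the outer vector's block, so allocations happen only when paths itself has to grow, not once per row.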