Let's say I have to iterate over a potentially very large vector of numbers and copy the even and odd elements into new, separate vectors. (The source vector may have any proportion of evens to odds; it could be all evens, all odds, or anywhere in between.)
For simplicity, push_back is often used for this sort of thing:
for (std::size_t Index = 0; Index < Source.size(); Index++)
{
    if (Source[Index] % 2) Odds.push_back(Source[Index]);
    else Evens.push_back(Source[Index]);
}
However, I'm worried that the repeated reallocations will make this inefficient, even harmful if it's used as part of the implementation of something like a sorting algorithm, where performance is paramount. QuickSort, for example, partitions elements much like this.
You could use reserve() to allocate memory beforehand so that each destination vector needs only one allocation, but then you have to iterate over the entire source vector twice: once to count how many elements belong in each destination, and once more for the actual copying.
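Something like this two-pass version is what I mean (a sketch, using the same Source, Evens, and Odds vectors as above):

// First pass: count the odd elements so both destinations can
// be sized exactly, with one allocation each.
std::size_t OddCount = 0;
for (std::size_t Index = 0; Index < Source.size(); Index++)
    if (Source[Index] % 2) OddCount++;

Odds.reserve(OddCount);
Evens.reserve(Source.size() - OddCount);

// Second pass: the same copy loop as before, now guaranteed
// never to reallocate inside push_back.
for (std::size_t Index = 0; Index < Source.size(); Index++)
{
    if (Source[Index] % 2) Odds.push_back(Source[Index]);
    else Evens.push_back(Source[Index]);
}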
You could, of course, reserve the same amount of space as the source vector's size for each new vector instead, since neither will ever need to hold more than that, but that seems somewhat wasteful.
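That is, something like this (again assuming the same vectors):

// One allocation each and a single pass over Source, but the two
// reservations together total twice Source.size(), so half of
// that capacity always goes unused.
Evens.reserve(Source.size());
Odds.reserve(Source.size());
// ...then run the same copy loop as above.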
Is there a better method that I'm missing? Is push_back() usually trusted to manage this sort of thing for the programmer, or can it become burdensome for performance-sensitive algorithms?