I can think of three operations in C++ that can be described in some sense as having 'constant' complexity. I've seen some debate(*) over what this means, and it seems to me that we could just say "all these operations are constant, but some are more constant than others" :-)
(Edit 2: If you already think you know the answer, please read some of the debate at this question before rushing in too soon: What data structure, exactly, are deques in C++? Many people, with quite high scores, are arguing over what 'constant' means. My question isn't as obvious as you might think!)
(Edit: I don't need a primer in what 'complexity' means. I want a clear answer, perhaps with quotes from the C++ standard, that tells us exactly what is supposed to be constant. Processor ticks, real-world time, or something else? On other threads, some people have argued that time is totally irrelevant to what is required by the C++ standard.)
Do these complexity guarantees in the standard apply to the runtime of the operation? Or do they simply specify the (maximum) number of copies/moves of the contained objects that might happen? And what exactly does 'amortized' mean?
1. Given a (non-empty) vector<int> v, the following is clearly constant in runtime:
swap(v.front(), v.back());
(although a pedant might point out that it depends on whether the data is in the cache or swapped out or whatever!).
2. Given a list<int> l, doing a push_front is straightforward. Exactly one new item is allocated and a few pointers in the linked list are shuffled. Each push_front involves one allocation, always of the same amount of memory, so this is clearly fairly 'constant'. But, of course, the time to do the allocation could be quite variable: the memory manager might have to take a lot of time to find some suitable free memory. (I've put a rough sketch of what I mean just after point 3 below.)
3. But doing a push_back on a vector<int> is even more unpredictable. Most of the time it will be very quick, but every now and then it will have to reallocate space for all the data and copy every element to a new location. So it's less predictable in terms of runtime than a single list::push_front, but it's still referred to as constant (amortized). On average, the cost per element of adding a lot of data to a vector is independent of how much is added, and this is why it's called 'amortized constant' time. (Am I right? There's a small experiment illustrating this just after this list.)
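To make point 2 concrete, here is a rough sketch of what I understand a list push_front to boil down to. It's purely hypothetical (the Node type and sketch_push_front are my own names, not anything from the standard library), but it shows why the number of operations is fixed even though the time spent inside new isn't:

    struct Node { int value; Node* prev; Node* next; };   // hypothetical node type, for illustration only

    // Hypothetical sketch of what list::push_front boils down to:
    void sketch_push_front(Node*& head, int x) {
        Node* n = new Node{x, nullptr, head};   // the single allocation
        if (head) head->prev = n;               // plus a fixed number of pointer
        head = n;                               // updates, whatever the list's length
    }

    int main() {
        Node* head = nullptr;
        sketch_push_front(head, 42);   // always the same amount of 'work', but
        sketch_push_front(head, 43);   // the time spent inside new can vary
    }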
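And to illustrate the 'amortized' part of point 3, this is the kind of experiment I have in mind (the growth factor and the exact capacities are implementation-defined, so the numbers will differ between compilers, but the shape shouldn't): it counts how often push_back actually reallocates.

    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> v;
        int reallocations = 0;
        auto last_capacity = v.capacity();
        for (int i = 0; i < 1000000; ++i) {
            v.push_back(i);
            if (v.capacity() != last_capacity) {   // capacity grew, so a reallocation (and copy) happened
                ++reallocations;
                last_capacity = v.capacity();
            }
        }
        std::cout << "reallocations: " << reallocations << '\n';
        // Only a handful of reallocations for a million push_backs, so the total
        // copying work stays proportional to the number of elements added,
        // i.e. constant per push_back on average -- if I've understood 'amortized' right.
    }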
And finally, I asked about int to avoid the complexities of having another type. For example, a vector< vector<int> > might be a little more complicated to reason about, because each element of the vector (of vectors) might have a different size and, for example, swapping two elements is not quite as constant as in case 1 above. But ideally, we could answer for all vector<T>, not just T=int.
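For what it's worth, my understanding is that swapping two vector<int> elements only exchanges their internal buffers (pointer, size, capacity) rather than copying any of the ints, so even this case looks constant to me, but I don't know whether that's what the standard's 'constant' is promising, which is part of what I'm asking:

    #include <utility>
    #include <vector>

    int main() {
        std::vector<std::vector<int>> vv(2);
        vv[0].assign(1000000, 1);   // a large inner vector
        vv[1].assign(10, 2);        // a small one

        // As far as I can tell, this swaps the internal buffers and never
        // touches the million ints, so it runs in the same time regardless
        // of the inner sizes -- but is that what 'constant' is meant to guarantee?
        std::swap(vv[0], vv[1]);
    }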
(*) For an example of the debates, see the comments to these answers: What data structure, exactly, are deques in C++?