
I read here, from the accepted answer, that a `std::deque` has the following characteristics:

1- Random access - constant O(1)
2- Insertion or removal of elements at the end or beginning - amortized constant O(1)
3- Insertion or removal of elements - linear O(n)

My question is about point 2. How can a deque have an amortized constant insertion at the end or beginning?

I understand that a `std::vector` has amortized constant time complexity for insertions at the end. This is because a vector is contiguous, implemented as a dynamic array: when it runs out of memory on a `push_back`, it allocates a whole new block, copies the existing items from the old location to the new one, and then frees the old block. That operation, I understand, is amortized constant. But how does this apply to a deque? How can insertions at the front and back of a deque be amortized constant? I was under the impression that they were supposed to be constant O(1), since I know that a deque is composed of memory chunks.

Peter Moran
Rajeshwar
  • The same principle as `std::vector` applies. The operation is `O(1)`, with cases requiring new allocation when the previously allocated slots are full. – njzk2 Jan 20 '15 at 16:50
  • @njzk2 You **don't** need to copy old values. You simply allocate a new *fixed size* chunk and link it at the beginning/end. That's O(1); not amortized O(1) [assuming allocation of *fixed* amount of memory is O(1)]. Maybe it uses amortized in order to allow implementations using a single big block of contiguous memory, however it wouldn't be very useful in practice, except if the `deque` doesn't change much and you really need contiguous memory to perform some particular operation very efficiently. – Bakuriu Jan 20 '15 at 16:51
  • @Bakuriu: ok. Just new allocations, then. Thank you for clarifying. – njzk2 Jan 20 '15 at 16:54
  • According to this: http://en.cppreference.com/w/cpp/container/deque " Insertion or removal of elements at the end or beginning - **constant** O(1) " – Erel Segal-Halevi May 30 '18 at 07:08

2 Answers


The usual implementation of a deque is basically a vector of pointers to fixed-sized nodes.

Allocating the fixed-size node clearly has constant complexity, so that's pretty easy to handle--you just amortize the cost of allocating a single node across the number of items in that node to get a constant complexity for each.

The vector of pointers part is what's (marginally) more interesting. When we have allocated enough of the fixed-size nodes that the vector of pointers is full, we need to increase the size of that vector. As with `std::vector`, we need to copy its contents to the newly allocated vector, so its growth must follow a geometric (rather than arithmetic) progression. This means that as we accumulate more pointers to copy, we do the copying less and less frequently, so the total time devoted to copying pointers remains amortized constant per insertion.

As a side note: the "vector" part is normally treated as a circular buffer, so if you're using your deque as a queue, constantly adding to one end and removing from the other does not result in re-allocating the vector--it just means moving the head and tail pointers that keep track of which of the pointers are "active" at a given time.

Jerry Coffin
  • But if I'm reading the standard correctly, 23.3.3.4/3 states that `...Inserting a single element either at the beginning or end of a deque always takes constant time...`, where the word "always" seems to preclude even a geometric reallocation. – Mark B Jan 20 '15 at 17:16
  • @MarkB: I think your reading is a reasonable one, but the only way I know of to meet that (and all the other) requirements would be to pre-allocate the vector part as the largest you'll ever support, so you never grow it at all. That's clearly possible, but I think most people would rather it grew, even at the expense of amortized constant complexity. – Jerry Coffin Jan 20 '15 at 17:22
  • I happen to agree with your interpretation but I wanted to make sure I wasn't missing something completely obvious. – Mark B Jan 20 '15 at 17:29
  • @MarkB: No, I don't think so. It's probably worth mentioning, however, that you'd normally expect the number of items in a single node to be large enough that the number of pointers is always fairly small, so 1) even when the time grows, it's unlikely to be very big, and 2) it happens extremely rarely anyway. – Jerry Coffin Jan 20 '15 at 17:33
  • 1
    @MarkB related gcc discussion: https://gcc.gnu.org/bugzilla/show_bug.cgi?id=18080 Basically `"All of the complexity requirements in this clause are stated > solely in terms of the number of operations on the contained objects."`. So it doesn't include the complexity of operating on helper structs. – Dan M. May 01 '18 at 19:59

The (prosaic) answer lies in containers.requirements.general, 23.2.1/2:

All of the complexity requirements in this Clause are stated solely in terms of the number of operations on the contained objects.

Reallocating the array of pointers is hence not covered by the complexity guarantee of the standard and may take arbitrarily long. As mentioned before, it likely adds amortized constant overhead to each `push_front()`/`push_back()` call (or an equivalent modifier) in any "sane" implementation. I would not recommend using `deque` in RT-critical code, though. Typically, in an RT scenario, you don't want unbounded queues or stacks (which in C++ by default use `deque` as the underlying container), nor memory allocations that could fail, so you will most likely use a preallocated ring buffer (e.g. Boost's `circular_buffer`) instead.

Arne Vogel
  • `std::vector<std::unique_ptr<T>>` with wrapping to make it look like it holds `T` would thus have constant cost operations for almost everything? – Yakk - Adam Nevraumont Jan 20 '15 at 20:40
  • That particular example is invalid because the standard guarantees that a vector is contiguous, in 23.3.7.1/1: "The elements of a vector are stored contiguously, meaning that if v is a vector where T is some type other than bool, then it obeys the identity &v[n] == &v[0] + n for all 0 <= n < v.size()." I guess some other mischief where an implementation is conforming to the letter but not the spirit is possible. – Arne Vogel Jan 21 '15 at 14:37
  • Huh? I was just pointing out that if you wrote a container as I mentioned, it would (under the standard rules) count many operations as non-amortized O(1), including ones like "erase half the container", but not "sort the container". Yes, it wouldn't be a `vector`, but it would be a container. It seemed funny to me. – Yakk - Adam Nevraumont Jan 21 '15 at 14:38
  • Oh, I agree, it does seem funny. On the other hand, real implementations don't conform 100% anyhow, so abuses like this aren't needed. Apart from "ordinary" bugs, there is also e.g. the ref-counting implementation of std::basic_string in libstdc++, which is no longer valid in C++11, but fixing it now would break the ABI. Strictly speaking, all implementations support only a finite number of elements in a container, so the devs could just as well argue that every operation is O(1), just with a **very** large constant factor. – Arne Vogel Jan 21 '15 at 14:48