I was wondering why the C++ standard requires that `std::sort` only take random-access iterators. I don't see the advantage, since both `std::sort` and `std::list::sort` have O(N·log N) complexity. Restricting `std::sort` to random-access iterators (RAI) seems to have made it necessary to write a separate member function for lists with the same complexity.
The same applies to `std::partial_sort`, where a non-RAI counterpart for `std::list` is simply missing to this day.
Is this design historical, because people used variants of quicksort to implement `std::sort`?
If there is an advantage to writing sort algorithms for RAI containers, would it be better to make `std::sort` more general, and let RAI containers like `std::vector` provide a specialized `v.sort()`?