Many C++ standard algorithms, like std::sort(), assume only that the comparator comp is a strict weak ordering; they cannot assume that comp has any other (nice) properties. But often comp does have more properties than just being a strict weak ordering. In particular, comp is often a strict total order (so, in particular, exactly one of the following always holds for all a and b: comp(a, b), comp(b, a), or a == b). For example, the usual operator<() on integers, on std::strings, and on floating-point numbers (setting NaN aside) is a strict total order.
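To make the distinction concrete, here is a small self-contained example (ci_less is a name I made up for illustration): case-insensitive string comparison is a perfectly good strict weak ordering, yet it is not a strict total order, because distinct strings can be equivalent under it.

```cpp
#include <algorithm>
#include <cassert>
#include <cctype>
#include <string>

// A case-insensitive comparison on std::string: a valid strict weak
// ordering, but NOT a strict total order, because distinct strings
// such as "abc" and "ABC" are equivalent under it without being equal.
bool ci_less(const std::string& a, const std::string& b) {
    return std::lexicographical_compare(
        a.begin(), a.end(), b.begin(), b.end(),
        [](unsigned char x, unsigned char y) {
            return std::tolower(x) < std::tolower(y);
        });
}

int main() {
    // Neither string is less than the other, so they are equivalent...
    assert(!ci_less("abc", "ABC") && !ci_less("ABC", "abc"));
    // ...yet they are not equal, so trichotomy fails for ci_less,
    // whereas operator<() on int always satisfies it.
    assert(std::string("abc") != std::string("ABC"));
}
```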
By limiting itself to assuming only that comp is a strict weak ordering, does the C++ standard library limit itself to using suboptimal algorithms? Said differently, if the C++ standard algorithms assumed that comparators were strict total orders instead of just strict weak orderings, would some standard algorithms be faster than what's currently implemented?
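To see one concrete way the weaker contract constrains implementations (a minimal sketch; equivalent is a helper name I'm inventing here): with only a strict weak ordering, the strongest test available is equivalence, which takes two comparator calls, while an operator== consistent with comp would need just one call, and that call can itself be cheaper for some types.

```cpp
#include <cassert>
#include <functional>
#include <string>

// Under a strict weak ordering, the strongest relation the library can
// test is *equivalence*, and it costs two comparator calls:
template <class T, class Compare>
bool equivalent(const T& a, const T& b, Compare comp) {
    return !comp(a, b) && !comp(b, a);
}

int main() {
    std::string a = "abc", b = "abcdef";
    // Two calls to operator< via std::less:
    assert(!equivalent(a, b, std::less<std::string>()));
    // One call to the (assumed-consistent) operator==, which can also
    // be cheaper per call: string equality can bail out immediately on
    // mismatched lengths, while a lexicographic comparator walks the
    // common prefix before deciding.
    assert(!(a == b));
}
```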
Update: To be more exact about what "strict total order" means, let's suppose that the STL assumed that comp (operating on objects of type T) had all of the nice order-theoretic properties that operator<() on ints has. (So if you want, we may also assume that there is an operator==() defined on objects of type T that is consistent with comp, i.e., a == b holds exactly when !comp(a, b) && !comp(b, a); this assumption is optional and you can make different assumptions if you'd like.) Could any STL algorithms be made faster?
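For a sense of what "faster" might look like, here is a hypothetical sketch (partition3 and its interface are invented for illustration, not taken from any implementation) of a Dutch-national-flag, "fat pivot" partition in the style of Bentley-McIlroy quicksort, where keys equal to the pivot are grouped so they never need to be recursed into. The equal-to-pivot test is where the assumed operator== would come in:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical sketch of a Dutch-national-flag ("fat pivot") partition.
// Afterwards:
//   [0, lo)        : elements < pivot
//   [lo, hi)       : elements == pivot (never sorted recursively again)
//   [hi, v.size()) : elements > pivot
template <class T, class Compare>
std::pair<std::size_t, std::size_t>
partition3(std::vector<T>& v, T pivot, Compare comp) {  // pivot taken by
    std::size_t lo = 0, mid = 0, hi = v.size();         // value so the
    while (mid < hi) {                                  // swaps can't move it
        if (comp(v[mid], pivot)) {
            std::swap(v[lo++], v[mid++]);
        } else if (v[mid] == pivot) {  // the assumed operator==
            ++mid;
        } else {
            std::swap(v[mid], v[--hi]);
        }
    }
    return {lo, hi};
}
```

Note that the same structure works with comp alone (the middle test becomes !comp(pivot, v[mid])), so the comparison count is identical; any saving would come from == being cheaper per call than a comparator call for a given T, which is exactly the kind of gain the question is asking about.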
More generally, if the STL made "nicer" assumptions about comp (i.e., assumed more than that comp is a mere strict weak ordering), could any STL algorithms be made faster?