Some time ago I devised an O(N lg N) sorting algorithm I called "tournament sort", which produces output items in order (i.e., it starts by finding the first item, then the second item, and so on). I'm not sure it's entirely practical, since its bookkeeping overhead exceeds that of quicksort, merge sort, etc., but in a benchmark with 1,000,000 random items the comparison count actually came in below that of a standard library quicksort implementation (though I'm not sure how it would fare against newer ones).
For purposes of my algorithm, the "score" of each item is the number of items it's known to be better than. Initially each item has a score of 0. When two items are compared, the better item adds to its score the score of the other item, plus one. To run the algorithm, declare all items "eligible", and as long as more than one eligible item remains, compare the two eligible items with the lowest scores and make the loser "ineligible". Once only one eligible item remains, output it, and then declare eligible all the items it had "beaten" (i.e., the items that lost a comparison directly to it).
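A minimal Python sketch of the above, assuming ascending order and using a binary heap for the eligible set (the names and structure here are illustrative, not taken from any existing implementation):

    import heapq
    from itertools import count

    def tournament_sort(items, better=lambda a, b: a < b):
        """Return a new list of `items` in order, where better(a, b) means
        "a should come before b"."""
        n = len(items)
        scores = [0] * n                     # items each index is known to beat
        beaten = [[] for _ in range(n)]      # indices beaten directly by each index
        tie = count()                        # tie-breaker so the heap never compares items
        # the "eligible" set, kept as a min-heap ordered by score
        eligible = [(0, next(tie), i) for i in range(n)]
        heapq.heapify(eligible)

        out = []
        while eligible:
            # play matches between the two lowest-scoring eligible items
            while len(eligible) > 1:
                _, _, a = heapq.heappop(eligible)
                _, _, b = heapq.heappop(eligible)
                if better(items[b], items[a]):
                    a, b = b, a              # a is the winner, b the loser
                scores[a] += scores[b] + 1   # winner absorbs loser's score, plus one
                beaten[a].append(b)          # loser becomes ineligible
                heapq.heappush(eligible, (scores[a], next(tie), a))
            # one eligible item left: it is the next output item
            _, _, winner = heapq.heappop(eligible)
            out.append(items[winner])
            # everything the winner had beaten directly becomes eligible again
            for i in beaten[winner]:
                heapq.heappush(eligible, (scores[i], next(tie), i))
        return out

Pairing the two lowest-scoring eligible items presumably serves to keep the implied tournament tree balanced, which is what keeps the comparison count in the N lg N range.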
The priority queue required to find the two lowest-scoring eligible items adds a somewhat annoying amount of bookkeeping overhead, but if one were sorting things that are expensive to compare, the algorithm might be interesting.
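As an illustration of how one might measure the comparison count (this is not the original benchmark, just a way to reproduce that kind of measurement), one can wrap the values in a small class that counts calls to its comparison method:

    import random

    class Counted:
        """Wrap a value and count how many comparisons are performed."""
        calls = 0
        def __init__(self, v):
            self.v = v
        def __lt__(self, other):
            Counted.calls += 1
            return self.v < other.v

    data = [Counted(random.random()) for _ in range(100_000)]
    result = tournament_sort(data)
    print(Counted.calls, "comparisons")
    assert [c.v for c in result] == sorted(c.v for c in data)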