What is meant by "sort in place"?
- They are not comparable: *sort in place* is a characteristic of a sort algorithm, *natural ordering* refers to the order by which you want your collection sorted. So you could sort in place using natural ordering. – assylias May 16 '13 at 10:55
- Meaning of *in-place* in general semantics [here](https://stackoverflow.com/q/2779797/465053). – RBT Feb 15 '18 at 00:20
5 Answers
The idea of an in-place algorithm isn't unique to sorting, but sorting is probably the most important case, or at least the most well-known. The idea is about space efficiency - using the minimum amount of RAM, hard disk or other storage that you can get away with. This was especially relevant going back a few decades, when hardware was much more limited.
The idea is to produce an output in the same memory space that contains the input by successively transforming that data until the output is produced. This avoids the need to use twice the storage - one area for the input and an equal-sized area for the output.
Sorting is a fairly obvious case for this because sorting can be done by repeatedly exchanging items - sorting only re-arranges items. Exchanges aren't the only approach - the Insertion Sort, for example, uses a slightly different approach which is equivalent to doing a run of exchanges but faster.
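For concreteness, here's a minimal sketch (Java, illustrative names) of an insertion sort working this way - the array is rearranged within its own storage, and the only extra memory is a couple of working variables:

```java
// Minimal sketch of an in-place insertion sort: the input array is
// transformed into the output, using only a few working variables.
static void insertionSort(int[] a) {
    for (int i = 1; i < a.length; i++) {
        int key = a[i];              // element to insert into the sorted prefix a[0..i-1]
        int j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];         // shift larger elements one slot to the right
            j--;
        }
        a[j + 1] = key;              // drop the element into its final position
    }
}
```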
Another example is matrix transposition - again, this can be implemented by exchanging items. Adding two very large numbers can also be done in-place (the result replacing one of the inputs) by starting at the least significant digit and propagating carries upwards.
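A rough sketch of the transposition case (square matrices only, to keep it short) - each pair of elements across the diagonal is swapped exactly once, with no second matrix allocated:

```java
// Sketch of in-place transposition for a square matrix: each pair
// m[i][j] / m[j][i] above the diagonal is swapped exactly once.
static void transpose(int[][] m) {
    int n = m.length;
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; j++) {
            int tmp = m[i][j];
            m[i][j] = m[j][i];
            m[j][i] = tmp;
        }
    }
}
```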
Getting back to sorting, the advantages to re-arranging "in place" get even more obvious when you think of stacks of punched cards - it's preferable to avoid copying punched cards just to sort them.
Some algorithms for sorting allow this style of in-place operation whereas others don't.
However, all algorithms require some additional storage for working variables. If the goal is simply to produce the output by successively modifying the input, it's fairly easy to define algorithms that do that by reserving a huge chunk of memory, using that to produce some auxiliary data structure, then using that to guide those modifications. You're still producing the output by transforming the input "in place", but you're defeating the whole point of the exercise - you're not being space-efficient.
For that reason, the normal definition of an in-place algorithm requires that you achieve some standard of space efficiency. It's absolutely not acceptable to use extra space proportional to the input (that is, O(n) extra space) and still call your algorithm "in-place".
The Wikipedia page on in-place algorithms currently claims that an in-place algorithm can only use a constant amount - O(1) - of extra space.
In computer science, an in-place algorithm (or in Latin in situ) is an algorithm which transforms input using a data structure with a small, constant amount of extra storage space.
There are some technicalities specified in the In Computational Complexity section, but the conclusion is still that e.g. Quicksort requires O(log n) space (true) and therefore is not in-place (which I believe is false).
O(log n) is much smaller than O(n) - for example the base 2 log of 16,777,216 is 24.
Quicksort and heapsort are both normally considered in-place, and heapsort can be implemented with O(1) extra space (I was mistaken about this earlier). Mergesort is more difficult to implement in-place, but the out-of-place version is very cache-friendly - I suspect real-world implementations accept the O(n) space overhead - RAM is cheap but memory bandwidth is a major bottleneck, so trading memory for cache-efficiency and speed is often a good deal.
[EDIT When I wrote the above, I assume I was thinking of in-place merge-sorting of an array. In-place merge-sorting of a linked list is very simple. The key difference is in the merge algorithm - doing a merge of two linked lists with no copying or reallocation is easy, doing the same with two sub-arrays of a larger array (and without O(n) auxiliary storage) AFAIK isn't.]
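To illustrate that key difference, here's a rough sketch of the linked-list merge step, using a hypothetical minimal `Node` class - the nodes are relinked rather than copied, so no O(n) auxiliary buffer is needed:

```java
// Hypothetical minimal singly-linked list node, for illustration only.
class Node {
    int value;
    Node next;
    Node(int value) { this.value = value; }
}

// Merge two already-sorted lists by relinking nodes - no copying, no
// reallocation, just pointer updates and a couple of local variables.
static Node merge(Node a, Node b) {
    Node head = new Node(0);                 // dummy head simplifies the loop
    Node tail = head;
    while (a != null && b != null) {
        if (a.value <= b.value) { tail.next = a; a = a.next; }
        else                    { tail.next = b; b = b.next; }
        tail = tail.next;
    }
    tail.next = (a != null) ? a : b;         // append whatever is left over
    return head.next;
}
```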
Quicksort is also cache-efficient, even in-place, but can be disqualified as an in-place algorithm by appealing to its worst-case behaviour. There is a degenerate case (in a non-randomized version, typically when the input is already sorted) where the run-time is O(n^2) rather than the expected O(n log n). In this case the extra space requirement is also increased to O(n). However, for large datasets and with some basic precautions (mainly randomized pivot selection) this worst-case behaviour becomes absurdly unlikely.
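For reference, a rough sketch of an in-place quicksort with randomized pivot selection - the "basic precaution" mentioned above (Lomuto partitioning here, purely for brevity; the method and `swap` helper names are illustrative). Partitioning only ever swaps elements, so the only extra storage is the recursion stack:

```java
import java.util.concurrent.ThreadLocalRandom;

// Sketch of in-place quicksort with a randomized pivot.
static void quicksort(int[] a, int lo, int hi) {
    if (lo >= hi) return;
    int p = ThreadLocalRandom.current().nextInt(lo, hi + 1);
    swap(a, p, hi);                          // move the random pivot to the end
    int pivot = a[hi];
    int i = lo;
    for (int j = lo; j < hi; j++) {
        if (a[j] < pivot) swap(a, i++, j);   // Lomuto partition: swaps only
    }
    swap(a, i, hi);                          // pivot lands in its final slot
    quicksort(a, lo, i - 1);                 // recursion depth is O(log n) in the expected case
    quicksort(a, i + 1, hi);
}

static void swap(int[] a, int i, int j) {
    int t = a[i]; a[i] = a[j]; a[j] = t;
}
```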
My personal view is that O(log n) extra space is acceptable for in-place algorithms - it's not cheating as it doesn't defeat the original point of working in-place.
However, my opinion is of course just my opinion.
One extra note - sometimes, people will call a function in-place simply because it has a single parameter for both the input and the output. It doesn't necessarily follow that the function was space efficient, that the result was produced by transforming the input, or even that the parameter still references the same area of memory. This usage isn't correct (or so the prescriptivists will claim), though it's common enough that it's best to be aware but not get stressed about it.
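In Java terms, for example, `Collections.sort(list)` looks in-place to the caller - the same list object ends up sorted - even though, at least in the implementations I've seen, it uses O(n) working space under the hood (merge buffers or a temporary array). Sorting via a stream, by contrast, leaves the original alone and builds a new list. A rough sketch of the difference as the caller sees it:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;

public class InPlaceUsageDemo {
    public static void main(String[] args) {
        // "In-place" in the loose sense: the caller's own list ends up sorted.
        List<Integer> numbers = new ArrayList<>(Arrays.asList(3, 1, 2));
        Collections.sort(numbers);
        System.out.println(numbers);                     // [1, 2, 3]

        // Not in-place in any sense: the original is untouched, a new list is built.
        List<Integer> unsorted = new ArrayList<>(Arrays.asList(3, 1, 2));
        List<Integer> sorted = unsorted.stream()
                .sorted()
                .collect(Collectors.toList());
        System.out.println(unsorted + " -> " + sorted);  // [3, 1, 2] -> [1, 2, 3]
    }
}
```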

In-place sorting means sorting without any significant extra space requirement. According to Wikipedia:
an in-place algorithm is an algorithm which transforms input using a data structure with a small, constant amount of extra storage space.
Quicksort is one example of in-place sorting.

- However, some people use (or mis-use) the term to mean a function that gives you the result in the same container. This is effectively in-place in terms of what the caller sees, but the memory requirements are greater. – May 16 '13 at 11:00
- According to Wikipedia: Quicksort operates in-place on the data to be sorted as it only ever swaps two elements. However, most implementations require O(log n) space to keep track of the recursive function calls as part of the divide and conquer strategy; so Quicksort is not an in-place algorithm. – cmbaxter May 16 '13 at 11:03
- @cmbaxter - O(log n) space is a lot better than O(n) space, and that's the key point. A small amount isn't necessarily a constant amount. Non-in-place algorithms produce a new copy while the old one is still kept. Algorithms that require O(n) space are disqualified as in-place because it's considered cheating (defeating-the-point) to use O(n) space just to figure out how to do the swaps. – May 16 '13 at 11:14
- @cmbaxter - Though now I re-read, I see the "constant space" claim comes from Wikipedia via this answer - I guess I disagree with Wikipedia on this. For the record, I don't know of an O(n log n) comparison-based sorting algorithm that *doesn't* have an O(log n) requirement for extra space - heap-sort has the same stack issue that quicksort has, for example, for the same reason - recursion. – May 16 '13 at 11:22
- Gotta love Wikipedia. The In Place Sort article states that Quicksort is not in place because it's not using a constant size data structure. But then if you read the Quicksort article it says it can be accomplished with an in-place sort. No wonder there is confusion on things like this :) – cmbaxter May 16 '13 at 11:26
- @Steve314, Merge sort on a linked list can use O(1) space and O(n log n) time as long as you consider the links part of the original objects. – Mark Ransom May 16 '13 at 18:07
- @pjs - it looks like you're right - the `heapify` algorithm is tail recursive, which fooled me when I checked a bit earlier. Time to edit my answer... – May 16 '13 at 18:36
- @Steve314, it has nothing to do with stacks. Merging works by moving items from one list to another, and you can do that with no additional space just by changing the link. The only thing you need to track is where one list ends and another begins, and you can do that with a single variable telling you the length of the lists (since all but the last will have the same length). – Mark Ransom May 16 '13 at 18:37
- @Mark - It's been a [bad day](http://stackoverflow.com/questions/16582677/why-can-i-define-a-function-in-another-function/16582975#comment23845968_16582975) for that kind of thing. I'm stopping now before I cause any more damage. – May 16 '13 at 18:49
- @Steve314, we all have those days - don't worry about it. Funny your other comment mentions bubble sort, the first time I had to sort a linked list that's what I used. It finally dawned on me one day that merge sort would have been perfect, and I was appropriately embarrassed. – Mark Ransom May 16 '13 at 19:02
I don't think these terms are closely related:
Sort in place means to sort an existing list by modifying the element order directly within the list. The opposite is leaving the original list as is and creating a new list with the elements in order.
Natural ordering is a term that describes how complete objects can somehow be ordered. You can for instance say that 0 is lower than 1 (natural ordering for integers) or that A is before B in alphabetical order (natural ordering for strings). You can hardly say, though, that Bob is greater or lower than Alice in general, as it heavily depends on specific attributes (alphabetically by name, by age, by income, ...). Therefore there is no natural ordering for people.
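A quick Java sketch of that distinction (the `Person` record is hypothetical, and this assumes a recent Java version): integers and strings implement `Comparable`, so they have a natural ordering, while people have to be ordered by some explicitly chosen attribute:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class OrderingDemo {
    // No natural ordering: Person does not implement Comparable.
    record Person(String name, int age) {}

    public static void main(String[] args) {
        // Integers have a natural ordering, so no comparator is needed.
        List<Integer> numbers = new ArrayList<>(List.of(3, 1, 2));
        numbers.sort(null);                                    // [1, 2, 3]

        // People must be ordered by an explicitly chosen attribute.
        List<Person> people = new ArrayList<>(List.of(
                new Person("Bob", 30), new Person("Alice", 25)));
        people.sort(Comparator.comparing(Person::name));       // alphabetically by name
        people.sort(Comparator.comparingInt(Person::age));     // or by age
    }
}
```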

- I do agree with your comment, but Wikipedia says "an in-place algorithm is an algorithm which transforms input using a data structure with a small, constant amount of extra storage space." – spandey Apr 02 '19 at 04:45
I'm not sure these concepts are similar enough to compare as suggested. Yes, they both involve sorting, but one is about a sort ordering that is human understandable (natural), while the other describes how an algorithm uses memory: an in-place sort overwrites the existing structure instead of allocating an additional one (bubble sort, for example, works this way).

It can be done by using a swap function: instead of making a whole new structure, we rearrange the elements where they are - we often implement that kind of algorithm without even knowing its name :D
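For example, a rough sketch of a simple swap-based, in-place sort (selection sort here, with an illustrative `swap` helper) - the only extra storage is the temporary inside `swap`:

```java
// Sketch of a simple swap-based, in-place sort (selection sort).
static void selectionSort(int[] a) {
    for (int i = 0; i < a.length - 1; i++) {
        int min = i;
        for (int j = i + 1; j < a.length; j++) {
            if (a[j] < a[min]) min = j;   // find the smallest remaining element
        }
        swap(a, i, min);                  // move it into position i
    }
}

static void swap(int[] a, int i, int j) {
    int t = a[i];
    a[i] = a[j];
    a[j] = t;
}
```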
