This question is similar to (link), but it's not exactly what I am looking for. Here is a selection sort I wrote in Java.
public void sort() {
    int smallIndex;
    for (int i = 0; i < arr.length; i++) {
        // assume the element at i is the smallest of the remaining ones
        smallIndex = i;
        for (int j = i + 1; j < arr.length; j++) {
            if (arr[j] < arr[smallIndex]) {
                smallIndex = j;
            }
        }
        // only swap if a smaller element was actually found
        if (i < smallIndex)
            swap(i, smallIndex);
    }
}
Here is my swap method.
public void swap(int a, int b) {
    int temp = arr[a];
    arr[a] = arr[b];
    arr[b] = temp;
}
I am filling an array with 50,000 random elements in the range 0 to 999. The method sorts it correctly. Each execution takes about 1400 milliseconds (1300 to 1500).
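For reference, my test setup looks roughly like the sketch below (the class name SelectionSortTest and the exact timing code are just an approximation of what I'm running; the sort methods are the ones shown in this post):

import java.util.Random;

public class SelectionSortTest {
    int[] arr;

    // sort(), sortBook(), swap(), and minimumPosition() from this post go here

    public static void main(String[] args) {
        SelectionSortTest test = new SelectionSortTest();
        test.arr = new int[50000];
        Random rand = new Random();
        for (int i = 0; i < test.arr.length; i++) {
            test.arr[i] = rand.nextInt(1000); // random values 0 - 999
        }

        long start = System.currentTimeMillis();
        test.sort(); // or test.sortBook() for the book's version
        long elapsed = System.currentTimeMillis() - start;
        System.out.println("Sorted 50,000 elements in " + elapsed + " ms");
    }
}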
Here is a selection sort from a book I was looking at.
public void sortBook() {
    for (int i = 0; i < arr.length; i++) {
        int minPos = minimumPosition(i);
        swap(minPos, i);
    }
}
Here is the minimumPosition method.
public int minimumPosition(int from) {
    int minPos = from;
    for (int i = from + 1; i < arr.length; i++) {
        if (arr[i] < arr[minPos]) {
            minPos = i;
        }
    }
    return minPos;
}
The swap method is the same one I showed above. With the same 50,000 elements in the 0 to 999 range, I consistently get around 700 milliseconds of execution time.
My question is: why does sort() take almost twice as long to execute as sortBook()? I have stepped through both on smaller arrays, and they seem to take the same number of steps to do the same thing; sort() only swaps sometimes, while sortBook() swaps every time. To my novice eye they look to be doing almost exactly the same thing, yet sort() takes roughly double the time sortBook() does. Does anybody have any insight into why that is?