I have an assignment to compare the execution times of mergesort and quicksort on large arrays of integers. Both implementations are copied from a book, and both sort integers in ascending order. I am asked to state how much faster quicksort is on average. However, with my (admittedly small) sample of test runs, I found that mergesort was faster by a small percentage, which leads me to believe I'm testing improperly.
I have two programs, one for mergesort and one for quicksort; their main() code is identical (shown below). I create an array of random size between 0 and some chosen upper limit, fill it with random integers, sort it, and print it. I use System.currentTimeMillis() to measure the runtime of the program. For each algorithm I do 10 runs, add up the total time, and divide by 10 to get the average running time. I did not find that quicksort was faster than mergesort on average. I have also tried a fixed array size, but the results still do not suggest quicksort is faster. What would be a *proper* way to test?
import java.util.Arrays;
import java.util.Random;

public class SortTimer {

    public static void main(String[] args) {
        /* Create an array of some random size from 0 to 100000, fill it
           with random integers, and measure the time it takes to sort it
           in ascending order (where sort() is either quicksort or
           mergesort). */
        long a = System.currentTimeMillis();
        Random rnd = new Random(System.currentTimeMillis());
        Integer[] array = new Integer[rnd.nextInt(100000)];
        for (int i = 0; i < array.length; i++) {
            array[i] = rnd.nextInt();
        }
        sort(array);
        System.out.println(Arrays.toString(array));
        long b = System.currentTimeMillis() - a;
        System.out.println("Program runtime: " + b + "ms");
    }
}
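For reference, here is a standalone sketch of the 10-run averaging loop I describe above, with the only change being that it times just the sort() call itself (using nanoTime) and does a few warm-up runs first. The class name SortBenchmark is made up, and sort() here is just Arrays.sort() as a placeholder for either book implementation:

```java
import java.util.Arrays;
import java.util.Random;

public class SortBenchmark {

    // Placeholder: substitute the book's mergesort or quicksort here.
    static void sort(int[] a) {
        Arrays.sort(a);
    }

    public static void main(String[] args) {
        final int size = 100_000; // fixed size so runs are comparable
        final int runs = 10;
        Random rnd = new Random();

        // Warm-up runs so the JIT has compiled sort() before measuring.
        for (int i = 0; i < 3; i++) {
            sort(rnd.ints(size).toArray());
        }

        long totalNanos = 0;
        for (int run = 0; run < runs; run++) {
            int[] array = rnd.ints(size).toArray();
            long start = System.nanoTime(); // time only the sort itself,
            sort(array);                    // not allocation or printing
            totalNanos += System.nanoTime() - start;
        }
        System.out.println("Average sort time: "
                + (totalNanos / runs / 1_000_000.0) + " ms");
    }
}
```

This is what I mean by averaging over 10 runs; I don't know if excluding the println and array creation from the timed region is enough to change the outcome.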
Expected result: quicksort's average running time should be some percentage faster than mergesort's for large random integer arrays.
Actual result: mergesort was slightly faster in most tests (10K, 100K, and 1000K elements).