4

So I am trying out some sorting algorithms.

private static void quicksort(int[] list, int low, int high) {
    int pivot = list[low + (high - low) / 2]; // middle element as pivot
    int i = low;
    int j = high;
    while (i <= j) {
        while (list[i] < pivot) {
            i++;
        }
        while (list[j] > pivot) {
            j--;
        }
        if (i <= j) {
            // swap the out-of-place pair and advance both pointers
            int temp = list[i];
            list[i] = list[j];
            list[j] = temp;
            i++;
            j--;
        }
    }
    if (low < j) {
        quicksort(list, low, j);
    }
    if (i < high) {
        quicksort(list, i, high);
    }
}

This code runs on two arrays of integers with x entries each (let's say 1 billion). The first one is sorted and the second one is a permutation of array 1, where n pairs of elements are randomly chosen and swapped.
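To be concrete, the test data is built roughly like this (a minimal sketch; the helper name and the use of java.util.Random are illustrative, not my exact code):

import java.util.Random;

// Sketch of the test setup: array 1 is sorted, array 2 is the same data
// with n randomly chosen pairs of elements swapped.
private static int[][] buildTestArrays(int size, int n, long seed) {
    int[] sorted = new int[size];
    for (int k = 0; k < size; k++) {
        sorted[k] = k;                        // already in ascending order
    }
    int[] permuted = sorted.clone();          // start from the sorted data
    Random rnd = new Random(seed);
    for (int k = 0; k < n; k++) {             // swap n randomly chosen pairs
        int a = rnd.nextInt(size);
        int b = rnd.nextInt(size);
        int temp = permuted[a];
        permuted[a] = permuted[b];
        permuted[b] = temp;
    }
    return new int[][] { sorted, permuted };
}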

I choose the middle element as the pivot, so it should be optimal for the sorted case, right?

I am measuring the time the algorithm takes to sort each array and counting how many swaps and recursion steps occur. As expected, both of these counts are higher for array 2 with the random permutations.

But: the algorithm still takes longer to process the sorted array until I reach a high number of permutations. For n = 10000 I get something like 20 ms for the unsorted array and 30 ms for the sorted one. Why is that?

AliciaBytes
  • Generally it is faster because there shouldn't be any elements to swap. Have you compared this to the built-in `Arrays.sort(int[])`? – Peter Lawrey May 04 '14 at 13:25
  • 4
    It's a tricky business measuring efficiency in Java; the speed of the program depends on many factors. – Reinstate Monica May 04 '14 at 13:26
  • Sometimes, how you benchmark some code can impact how it is optimised. I would run the tests a number of times. – Peter Lawrey May 04 '14 at 13:27
  • check this post: http://stackoverflow.com/questions/2415193/worst-case-for-quicksort-when-can-it-occur – Kostas Kryptos May 04 '14 at 13:38
  • I wonder why so many people are falsely talking about the worst case for quicksort being an already sorted sequence. He takes the middle element as pivot in his code, so sorted isn't the worst case for his code (in fact it should be close to optimal). – AliciaBytes May 04 '14 at 13:53
  • Also, that question might be better for codereview.stackexchange.com if you reword it a bit. – AliciaBytes May 04 '14 at 13:56
  • 3
    My first guess would be that you are simply wrapping some `System.currentTimeMillis()` or `System.nanoTime()` calls around the invocations, and compute the time difference from that. **IF** you are doing that: Java has a JIT (Just-In-Time Compiler), and measuring the performance of a particular method is very difficult. Also see http://stackoverflow.com/questions/504103/how-do-i-write-a-correct-micro-benchmark-in-java – Marco13 May 04 '14 at 13:57

2 Answers

3

Best I can say so far is that you should double-check your timing. In general, profiling like this should be done as an average over many runs. I made a test class based on your code and got these results:

[graph of results confirming common sense]

This was done using System.nanoTime() as my profiling tool. Nothing fancy.

Edit: Here's a link to the profiling class I wrote. It outputs results in a CSV-like format so I could make the graph in a spreadsheet program.
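Roughly, the idea is something like this (a simplified sketch, not the exact class behind the link; the method name is illustrative). It warms up the JIT first, then averages System.nanoTime() over a number of runs, calling the quicksort from the question on a fresh copy of the input each time:

// Minimal sketch of a timing harness: warm up, then average over many runs.
static double averageSortTimeMillis(int[] template, int runs, int warmupRuns) {
    for (int r = 0; r < warmupRuns; r++) {        // warm-up so the JIT compiles quicksort first
        int[] copy = template.clone();
        quicksort(copy, 0, copy.length - 1);
    }
    long total = 0L;
    for (int r = 0; r < runs; r++) {
        int[] copy = template.clone();            // fresh copy of the same input each run
        long start = System.nanoTime();
        quicksort(copy, 0, copy.length - 1);
        total += System.nanoTime() - start;
    }
    return (total / (double) runs) / 1_000_000.0; // average time in milliseconds
}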

wrongu
  • All right, thank you very much. I think it simply takes more time the first time I sort an array. Maybe that difference comes from the time Java needs to create the object or something. If I run my code on a larger sample size I get the same results as you do. – user3601374 May 04 '14 at 16:23
-2

The worst-case scenario for the quicksort algorithm is when the array is sorted or almost sorted. That is most likely why it takes you longer to sort the array which is already sorted.

Jakkra