
I implemented a standard quicksort algorithm, tested it over several runs, and stored the running times in an array.

int max = 100;
int runs = 1000;
int n = 0;
int[] arr = new int[max];
long[] times = new long[runs];

while (n < runs) {
    for (int i = 0; i < max; i++) {
        arr[i] = (int) (Math.random() * 99);
        //arr[i] = i;
    }

    long start = System.nanoTime();
    quicksortsimple(arr, 0, max - 1);
    long now = System.nanoTime();

    times[n++] = now - start;
}

Now what happens is that the output "times" array starts with much bigger values which gradually decrease, and after about the 100th index (100 runs of quicksort) it becomes roughly constant, at less than one-tenth of the initial 2-3 values. The average value returned for n = 1000 is the same over several runs of the program.

Even if I initialize the array as already sorted (the commented-out line), the same thing happens, albeit with a higher average time (as it should).

But the same algorithm on the same kind of array shouldn't return such varying running times. What does the decreasing pattern indicate, if anything?

  • https://stackoverflow.com/a/11227902/4467208 – Murat Karagöz Sep 21 '17 at 12:21
  • It indicates your benchmark is flawed: https://stackoverflow.com/questions/504103/how-do-i-write-a-correct-micro-benchmark-in-java – Kayaman Sep 21 '17 at 12:23
  • As long as the numbers returned by `System.nanoTime()` don't get smaller with every iteration (so we don't have to deal with time travelling), there seem to be many possible explanations. I would suggest to only use measurements when they come in with constant values... – bkis Sep 21 '17 at 13:03

1 Answer


(The following explanation is not exact down to the last detail, but it's enough to understand what's going on.)

In the beginning, your JRE runs the program in interpreted mode (slow), until its HotSpot engine detects that the sorting is worth a speedup and creates a natively compiled translation, which is then used and runs much faster than the interpreted code from the first iterations.
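A common workaround (a minimal sketch, not a substitute for a real harness like JMH) is to run a few thousand warm-up iterations first, so HotSpot has compiled the hot methods before you start recording. The `quicksortsimple` below is a hypothetical stand-in for your method, since its body isn't shown in the question:

```java
import java.util.Arrays;
import java.util.Random;

public class WarmupBenchmark {
    // Hypothetical stand-in for the asker's quicksortsimple() (Lomuto partition).
    static void quicksortsimple(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        int p = a[hi], i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < p) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; }
        }
        int t = a[i]; a[i] = a[hi]; a[hi] = t;
        quicksortsimple(a, lo, i - 1);
        quicksortsimple(a, i + 1, hi);
    }

    public static void main(String[] args) {
        int max = 100, warmup = 5000, runs = 1000;
        Random rnd = new Random(42);
        int[] arr = new int[max];
        long[] times = new long[runs];

        // Warm-up phase: give HotSpot a chance to JIT-compile the sort
        // (results are discarded, we only trigger compilation).
        for (int w = 0; w < warmup; w++) {
            for (int i = 0; i < max; i++) arr[i] = rnd.nextInt(100);
            quicksortsimple(arr, 0, max - 1);
        }

        // Measurement phase: timings should now be much more stable.
        for (int n = 0; n < runs; n++) {
            for (int i = 0; i < max; i++) arr[i] = rnd.nextInt(100);
            long start = System.nanoTime();
            quicksortsimple(arr, 0, max - 1);
            times[n] = System.nanoTime() - start;
        }

        System.out.println("first: " + times[0] + " ns, last: " + times[runs - 1] + " ns");
        // Sanity check: the last array really is sorted.
        int[] copy = arr.clone();
        Arrays.sort(copy);
        System.out.println("sorted correctly: " + Arrays.equals(arr, copy));
    }
}
```

With the warm-up in place, the big initial values disappear from `times`, which is exactly the signature of the JIT effect described above.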

Ralf Kleberhoff
  • So there should've been a "point", or some discrete set of points, where the JRE changed its mode. But I use a times[] array of 1000, and the results are "continually" decreasing, with some ups and downs in between. What does this mean? Could you please elaborate on what could be happening? This seems quite interesting. – Akarsh Rastogi Sep 21 '17 at 13:18
  • The Hotspot compilation isn't the only factor influencing performance, but often the most prominent one. And the optimizations don't necessarily happen in one step. E.g. besides the `quicksortsimple()` method, there's `Math.random()` called quite often, and there's your outer loop. Each of them is a candidate for independent native-compilation, and finally the Hotspot engine might even inline all of them into one big native code. And there's the processor's cache hitting more often when the background tasks like Hotspot don't consume CPU anymore and so on... – Ralf Kleberhoff Sep 21 '17 at 14:13