I implemented a standard quicksort algorithm, timed it over several runs, and stored the running times in an array.
int max = 100;
int runs = 1000;
int n = 0;                                 // run counter (declaration was missing from the snippet)
int[] arr = new int[max];
long[] times = new long[runs];

while (n < runs) {
    for (int i = 0; i < max; i++) {
        arr[i] = (int) (Math.random() * 99);   // fresh random values in [0, 98] for every run
        //arr[i] = i;                           // alternative: already-sorted input
    }
    long start = System.nanoTime();
    quicksortsimple(arr, 0, max - 1);
    long now = System.nanoTime();
    times[n++] = now - start;
}
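
For reference, quicksortsimple is just a plain in-place quicksort. A minimal sketch of what such an implementation looks like, assuming a Lomuto partition with the last element as the pivot (my actual partitioning details may differ):

static void quicksortsimple(int[] a, int lo, int hi) {
    if (lo >= hi) return;                       // zero or one element: nothing to do
    int pivot = a[hi];                          // last element as pivot
    int i = lo;                                 // boundary of the "less than pivot" region
    for (int j = lo; j < hi; j++) {
        if (a[j] < pivot) {
            int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
            i++;
        }
    }
    int tp = a[i]; a[i] = a[hi]; a[hi] = tp;    // move pivot into its final position
    quicksortsimple(a, lo, i - 1);
    quicksortsimple(a, i + 1, hi);
}

With this kind of pivot choice an already-sorted array is the worst case, which is consistent with the sorted input taking longer on average (see below).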
Now what happens is that the output "times" array starts with much larger values, which gradually decrease; after roughly the 100th index (100 runs of quicksort) the values settle down to something fairly constant, less than one-tenth of the first two or three entries. The average over n = 1000 runs is the same across several executions of the program.
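
The average mentioned here is just the arithmetic mean of the times array; a sketch of that calculation (the averaging code itself isn't part of the snippet above):

long total = 0;
for (long t : times) {
    total += t;                                // sum of all per-run timings in nanoseconds
}
System.out.println("average per sort: " + (total / (double) runs) + " ns");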
Even if I initialize the array as already sorted (the commented-out line), the same thing happens, albeit taking more time on average (as it should).
But I wouldn't expect such widely varying running times for the same algorithm on the same kind of input. What does the decreasing pattern indicate, if anything?