I have some Java code that times the average of 10 runs of an insertion sort. It sorts the same array each time, but only the first 100 words, then the first 200 words, and so on up to the full length of the array. The code below is for 100 and 200 words; the remaining blocks look the same with just the variables changed.
However, every successive average gets shorter and shorter: e.g. 9658233 ns for 100 words, 4273459 ns for 200 words, 3716726 ns for 300 words, 2965091 ns for 400 words and 2842528 ns for 459 words (the full array). If I reverse the order, sorting the whole array first and the 100-word prefix last, the full array takes the longest to sort and the 100-word prefix the shortest.
Would using a different way to time the insertion sort give me more consistent results?
long i1AverageTime = 0L;
for (int i = 0; i < 10; i++) {
    long start = System.nanoTime();
    insertionSort(myArray, 100); // sort the first 100 words of the array
    long end = System.nanoTime();
    long timeTaken = end - start;
    i1AverageTime += timeTaken;
}
i1AverageTime /= 10; // average of the 10 tries, in nanoseconds

long i2AverageTime = 0L;
for (int i = 0; i < 10; i++) {
    long start = System.nanoTime();
    insertionSort(myArray, 200); // sort the first 200 words of the array
    long end = System.nanoTime();
    long timeTaken = end - start;
    i2AverageTime += timeTaken;
}
i2AverageTime /= 10;
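
Since the blocks only differ in the prefix length, here is the same timing pattern factored into one helper, just to show the structure more compactly. This is a sketch of what I have, not new behavior: averageSortTimeNs is a name I made up here, and it assumes myArray is a String[] of words and that insertionSort(array, n), as in my code above, sorts the first n entries.

    // Times 10 runs of insertionSort on the first prefixLength words
    // and returns the average duration in nanoseconds.
    static long averageSortTimeNs(String[] myArray, int prefixLength) {
        long total = 0L;
        for (int i = 0; i < 10; i++) {
            long start = System.nanoTime();
            insertionSort(myArray, prefixLength); // same insertionSort as above
            long end = System.nanoTime();
            total += end - start;
        }
        return total / 10; // average of the 10 tries
    }

    // Usage, replacing the repeated blocks:
    long i1AverageTime = averageSortTimeNs(myArray, 100);
    long i2AverageTime = averageSortTimeNs(myArray, 200);
    // ... and so on up to averageSortTimeNs(myArray, 459)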