
I am collecting data on how different sorting algorithms perform with different array sizes in Java. One of the methods I'm using is randomizing and sorting an integer array inside a loop and recording the elapsed time of each iteration with something that looks like this:

long startNano = System.nanoTime();
long startMillis = System.currentTimeMillis();
SortingAlgorithms.sort1(array);
long endNano = System.nanoTime();
long endMillis = System.currentTimeMillis();
this.elapsedNano = endNano - startNano;
this.elapsedMillis = endMillis - startMillis;

I'm not sure this is the best way to measure elapsed time, but I have noticed that with relatively small array sizes (500 up to 100K elements), the sort sometimes takes 5-6 times longer on the first iteration, no matter which sorting algorithm I use. Why does this happen? Does the algorithm somehow get 'optimized' the first time it's called? I made sure I'm not sorting an already-sorted array on later iterations; the array is re-randomized each time.
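For context, here is a simplified, self-contained sketch of the whole loop (the class name SortTimingDemo and the array size and iteration count are just placeholders for this post; SortingAlgorithms.sort1 is one of my sorting methods):

import java.util.Random;

// Simplified sketch of my measurement loop, not the exact code:
// each iteration re-randomizes the array and then times a single sort call.
public class SortTimingDemo {
    public static void main(String[] args) {
        int size = 100_000;
        int iterations = 20;
        int[] array = new int[size];
        Random rng = new Random();

        for (int i = 0; i < iterations; i++) {
            // Refill with random values so no iteration sorts an already-sorted array.
            for (int j = 0; j < size; j++) {
                array[j] = rng.nextInt();
            }

            long startNano = System.nanoTime();
            SortingAlgorithms.sort1(array);  // one of my sorting methods
            long endNano = System.nanoTime();

            System.out.println("iteration " + i + ": " + (endNano - startNano) + " ns");
        }
    }
}

With a setup like this, the time printed for the first iteration is consistently several times larger than the later ones.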

Miles
