I've been working on a 3-way merge sort algorithm, and the professor wants me to average how long it takes to merge sort arrays of 1000, 10000, 100000, 1000000, and 10000000 randomized integers over 10 trials each. However, I've run into a problem while trying to measure the run time: for example, when I time the sort of 1 million elements it reports around 300 ms, while in reality it takes around 40 seconds. I'm including the code of my main method below. I also tried putting startTime at the very beginning of main, and it displayed the same running time.
import java.util.Random;

public static void main(String[] args) {
    Random r = new Random();

    // Fill the array with 1,000,000 random integers
    int[] arr = new int[1000000];
    for (int i = 0; i < arr.length; i++)
        arr[i] = r.nextInt();

    // Time only the sort call
    long startTime = System.currentTimeMillis();
    MergeSort test = new MergeSort();
    test.sort(arr, 0, arr.length);
    long endTime = System.currentTimeMillis();
    long timeElapsed = endTime - startTime;

    print(arr); // helper of mine that prints the sorted array
    System.out.println("Execution time in milliseconds: " + timeElapsed);
}
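
For context, this is roughly the timing harness I'm trying to end up with once the measurement works: all five sizes, 10 trials each, averaged. It's only a minimal sketch; it assumes my own MergeSort class with the same sort(arr, 0, arr.length) call as above, and it keeps any printing out of the timed section.

import java.util.Random;

public class TimingHarness {
    public static void main(String[] args) {
        // Array sizes the professor asked for
        int[] sizes = {1000, 10000, 100000, 1000000, 10000000};
        int trials = 10;
        Random r = new Random();

        for (int n : sizes) {
            long totalMillis = 0;

            for (int t = 0; t < trials; t++) {
                // Fresh randomized array for every trial
                int[] arr = new int[n];
                for (int i = 0; i < arr.length; i++)
                    arr[i] = r.nextInt();

                // Time only the sort call, nothing else
                long start = System.currentTimeMillis();
                MergeSort test = new MergeSort(); // my 3-way merge sort class
                test.sort(arr, 0, arr.length);
                totalMillis += System.currentTimeMillis() - start;
            }

            System.out.println(n + " elements: average "
                    + (totalMillis / trials) + " ms over " + trials + " trials");
        }
    }
}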