I am analyzing the time efficiency of an algorithm: the algorithm is run on a random input array of a fixed size, once per loop iteration, for a set number of tests. I am confused as to why the first iteration (test) takes monumentally longer than all the following ones.
I am using C# with the Stopwatch class from System.Diagnostics. In the function that tests the algorithm, the stopwatch starts on the line immediately before the algorithm call and stops on the line immediately after, with no other code in between. The debugging output in the console shows that the first test always takes much longer than all the rest. My knowledge of computer science is limited, so apologies if the answer happens to be a no-brainer, but I was not able to find much useful information when googling "Algorithm time efficiency, why first loop takes longer than all others?".
void RunAlgorithm(int numTests = 10, int arraySize = 9, int max = 2, int min = -2)
{
    // rng, timer (a Stopwatch), totalTime, timeTaken, algorithmResult,
    // Algorithm() and MockReport() are class members defined elsewhere.
    for (int i = 0; i < numTests; i++)
    {
        // build a random, sorted input array for this test
        int[] A = new int[arraySize];
        for (int j = 0; j < arraySize; j++)
        {
            A[j] = rng.Next(min, max);
        }
        Array.Sort(A);

        // run the test and record the elapsed time
        timer.Start();
        algorithmResult = Algorithm(A);
        timer.Stop();
        timeTaken = (float)timer.Elapsed.TotalMilliseconds;
        totalTime += timeTaken;
        timer.Reset();
        MockReport(timeTaken);
    }
}
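In case it is relevant, below is a minimal sketch of a variation I am considering, where the algorithm is called once, untimed, before the measured loop, so that whatever one-time cost is hitting the first test would be paid outside the measurement. The wrapper name RunAlgorithmWithWarmup is made up for illustration; it assumes the same rng, timer, and Algorithm members as above. I do not know whether this is the right way to rule out that overhead.

// Sketch only: one untimed warm-up call before the measured tests.
// Assumes the same class members (rng, Algorithm, ...) as RunAlgorithm above.
void RunAlgorithmWithWarmup(int numTests = 10, int arraySize = 9, int max = 2, int min = -2)
{
    // build a throwaway input of the same shape as the real tests
    int[] warmup = new int[arraySize];
    for (int j = 0; j < arraySize; j++)
    {
        warmup[j] = rng.Next(min, max);
    }
    Array.Sort(warmup);

    Algorithm(warmup);                            // untimed warm-up call

    RunAlgorithm(numTests, arraySize, max, min);  // then run the timed tests as before
}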
In the output, the very first test took 0.3853 seconds, whereas all other tests averaged 0.0015 seconds.