I have a piece of code that I am trying to "profile" by running it in a for loop many times. I use a Stopwatch to measure the elapsed ticks and convert them to microseconds (the unit used for all timings below). However, I can't explain why the measured timings have such a large range. The results look like this:
Min: 1.0000
Max: 983.30
Avg: 1.4734
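For reference, the tick-to-microsecond conversion depends on the timer resolution, which can be checked with something like this (a minimal illustrative snippet; the frequency and tick length vary per machine):

// Illustrative: inspect the Stopwatch timer resolution (values differ per machine).
Console.WriteLine($"IsHighResolution: {Stopwatch.IsHighResolution}");
Console.WriteLine($"Frequency: {Stopwatch.Frequency} ticks per second");
Console.WriteLine($"One tick = {1000000m / Stopwatch.Frequency} microseconds");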
Here is the piece of code:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

List<decimal> alltimes = new List<decimal>();
for (int i = 0; i < 1000000; i++)
{
    var stopwatch = Stopwatch.StartNew();
    // my piece of code
    stopwatch.Stop();

    // Convert the elapsed ticks to microseconds.
    decimal ticks = stopwatch.ElapsedTicks;
    decimal microseconds = (ticks / Stopwatch.Frequency) * 1000000;
    alltimes.Add(microseconds);
}
var min = alltimes.Min();
var max = alltimes.Max();
var average = alltimes.Average();
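To get a better feel for where the outliers sit than min/max/average alone gives, the sorted timings can also be summarized with a median and a high percentile (an illustrative sketch, not part of the run above):

// Illustrative: median and 99.9th percentile from a sorted copy of the timings.
var sorted = alltimes.OrderBy(t => t).ToList();
var median = sorted[sorted.Count / 2];
var p999 = sorted[(int)(sorted.Count * 0.999)];
Console.WriteLine($"Median: {median}  99.9th percentile: {p999}");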
Any ideas?