
I have a piece of code that I am trying to "profile" by running it in a for loop many times. I use a Stopwatch to measure the elapsed ticks and convert them into microseconds (henceforth the unit of time used below). However, I can't explain why I get such a wide range of measured timings. The results look like this:

Min: 1.0000
Max: 983.30
Avg: 1.4734

Here is the piece of code:

// Requires: using System.Collections.Generic; using System.Diagnostics; using System.Linq;
List<decimal> alltimes = new List<decimal>();

for (int i = 0; i < 1000000; i++)
{
    var stopwatch = Stopwatch.StartNew();

    // my piece of code

    stopwatch.Stop();

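    // ElapsedTicks / Frequency gives elapsed seconds; multiplying by 1,000,000 converts to microseconds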
    decimal ticks = stopwatch.ElapsedTicks;
    decimal microseconds = (ticks / Stopwatch.Frequency) * 1000000;

    alltimes.Add(microseconds);
}
var min = alltimes.Min();
var max = alltimes.Max();
var average = alltimes.Average();

Any ideas?

asked by schwarz
    It's entirely plausible that garbage collection takes a millisecond in one of your runs. In general I'd recommend against rolling your own benchmarking though - if you can use Benchmark.Net, that would be a better plan. (It may not be appropriate, but if you *can* use it, do.) – Jon Skeet Jul 09 '20 at 10:27
  • the information that would be needed to answer your question is what you omitted under `// my piece of code`. most probably there's one operation that just randomly hangs from time to time, or needs some resources to be initialised once - have you looked at _which_ iteration takes so long? my money is on the first one. – Franz Gleichmann Jul 09 '20 at 10:27
  • @JonSkeet, today is my lucky day. I looked a lot around to find an open source solution before using Stopwatch. Benchmark.Net looks VERY promising... – schwarz Jul 09 '20 at 12:25
  • @FranzGleichmann, not really! I tried different pieces of code, some as simple as arithmetic ops or logging and the results greatly varied! – schwarz Jul 09 '20 at 12:26

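For illustration, here is a minimal sketch of the Benchmark.Net (BenchmarkDotNet) approach suggested in the comments; `PieceOfCode` is a hypothetical placeholder for the omitted code under test:

using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class PieceOfCodeBenchmarks
{
    // Hypothetical placeholder for the omitted "my piece of code".
    [Benchmark]
    public void PieceOfCode()
    {
        // ... the code being measured ...
    }
}

public class Program
{
    public static void Main()
    {
        // BenchmarkDotNet performs warm-up runs and many measured iterations,
        // then reports mean, error and outliers instead of raw min/max.
        BenchmarkRunner.Run<PieceOfCodeBenchmarks>();
    }
}

Because the harness warms up the code and flags outlier iterations, one-off costs such as first-iteration initialisation or a garbage collection pause (as suggested in the comments above) no longer dominate the min/max spread.
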
0 Answers