I wanted to track the performance of my code, so I stored the start and end times using System.DateTime.Now and took the difference between the two as the time my code took to execute.
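Roughly what I had (simplified; DoWork here is just a stand-in for the code I'm actually measuring):

```csharp
using System;
using System.Threading;

class TimingWithDateTime
{
    static void Main()
    {
        // Capture the wall-clock time before and after the work.
        DateTime start = DateTime.Now;
        DoWork();
        DateTime end = DateTime.Now;

        // The difference is what I was treating as the execution time.
        TimeSpan elapsed = end - start;
        Console.WriteLine($"DateTime.Now difference: {elapsed.TotalMilliseconds} ms");
    }

    // Stand-in for the real work being measured.
    static void DoWork() => Thread.Sleep(100);
}
```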
I noticed, though, that the difference didn't appear to be accurate, so I tried using a Stopwatch object instead. This turned out to be much, much more accurate.
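The Stopwatch version, which gave noticeably different numbers (again simplified, with the same placeholder DoWork):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class TimingWithStopwatch
{
    static void Main()
    {
        // Time the same work with a Stopwatch instead of DateTime.Now.
        Stopwatch sw = Stopwatch.StartNew();
        DoWork();
        sw.Stop();

        Console.WriteLine($"Stopwatch: {sw.Elapsed.TotalMilliseconds} ms");
    }

    // Stand-in for the real work being measured.
    static void DoWork() => Thread.Sleep(100);
}
```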
Can anyone tell me why Stopwatch would be more accurate than calculating the difference between a start and end time taken with System.DateTime.Now?
BTW, I'm not talking about tenths of a percent. I get about a 15-20% difference.