The main logic of a utility tool is in a function like this:
private void Run()
{
    DateTime startTime = DateTime.Now;
    Prepare();
    Search();
    Process();
    DateTime endTime = DateTime.Now;
    TimeSpan duration = endTime.Subtract(startTime);
    Console.WriteLine("Run took {0:00}:{1:00}:{2:00}",
        (int)duration.TotalHours, duration.Minutes, duration.Seconds);
}
When I run this, I can see with my own eyes that it takes at least 5 seconds: the Process() method spews console output, which I can watch scrolling for 5-6 seconds. Yet it reports "Run took 00:00:01".
I don't expect DateTime.Now to have microsecond precision, but why is it off by this much?
Update:
Following advice, I also ran a Stopwatch over the same period and compared it against subtracting two DateTime values, and I stepped through the code in the debugger. The two methods agree to within a fraction of a second: the Stopwatch under the debugger showed 1139 milliseconds. My hypothesis is that the time spent writing to the console is somehow not being counted, but I have no way to back that up (or disprove it).
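For reference, this is roughly how I took the two measurements side by side; it's a trimmed sketch with the three worker methods stubbed out, not the actual tool:

using System;
using System.Diagnostics;

class TimingSketch
{
    private static void Run()
    {
        // Time the same region two ways: DateTime subtraction and Stopwatch.
        DateTime startTime = DateTime.Now;
        Stopwatch sw = Stopwatch.StartNew();

        Prepare();
        Search();
        Process();

        sw.Stop();
        TimeSpan duration = DateTime.Now - startTime;

        Console.WriteLine("DateTime:  {0} ms", (long)duration.TotalMilliseconds);
        Console.WriteLine("Stopwatch: {0} ms", sw.ElapsedMilliseconds);
    }

    // Stubs standing in for the real methods in the question.
    private static void Prepare() { }
    private static void Search() { }
    private static void Process() { }

    static void Main() { Run(); }
}

Both numbers come out essentially the same (around 1139 ms under the debugger), which is what makes the visible 5-6 seconds of console output so puzzling.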