108

I just ran into some unexpected behavior with DateTime.UtcNow while writing some unit tests. When you call DateTime.Now or DateTime.UtcNow in rapid succession, you get back the same value for a longer-than-expected interval of time, rather than values that advance in finer millisecond increments.

I know there is a Stopwatch class that is better suited for precise time measurements, but I was curious if someone could explain this behavior in DateTime. Is there an official precision documented for DateTime.Now (for example, precise to within 50 ms)? Why would DateTime.Now be made less precise than what most CPU clocks could handle? Maybe it's just designed for the lowest-common-denominator CPU?

using System;
using System.Diagnostics;

class Program
{
    public static void Main(string[] args)
    {
        var stopwatch = new Stopwatch();
        stopwatch.Start();

        // Many consecutive iterations print identical Ticks/Millisecond values,
        // because DateTime.Now only advances when the system timer ticks.
        for (int i = 0; i < 1000; i++)
        {
            var now = DateTime.Now;
            Console.WriteLine("Ticks: {0}\tMilliseconds: {1}", now.Ticks, now.Millisecond);
        }

        stopwatch.Stop();
        Console.WriteLine("Stopwatch.ElapsedMilliseconds: {0}",
            stopwatch.ElapsedMilliseconds);

        Console.ReadLine();
    }
}
Andy White
  • What's the interval during which you get back the same value? – ChrisF Jan 26 '10 at 22:26
  • 3
    Read http://stackoverflow.com/questions/307582/how-frequent-is-datetime-now-updated-or-is-there-a-more-precise-api-to-get-the – Dan Diplo Jan 26 '10 at 22:27
  • When I ran the above code, I got only 3 unique values for ticks and milliseconds, and the final stop watch time was 147 ms, so it appears that on my machine it's only precise to around 50ms... – Andy White Jan 26 '10 at 22:34
  • I should say, the loop ran a bunch of times, but I only saw 3 distinct values... – Andy White Jan 26 '10 at 22:42
  • For anyone coming here, here is the TL;DR: use the QueryPerformanceCounter function, which "Retrieves the current value of the performance counter, which is a high resolution (<1us) time stamp that can be used for time-interval measurements." (For managed code, the System.Diagnostics.Stopwatch class uses QPC as its precise time basis.) http://msdn.microsoft.com/en-us/library/windows/desktop/dn553408(v=vs.85).aspx – AnotherUser Jun 03 '14 at 18:00
  • I ran the same test and got a different result: more than 3 unique values. One other thing: if I change the code inside the loop to `Console.WriteLine(String.Format("Ticks: {0} Milliseconds: {1} Stopwatch: {2}", now.Ticks, now.Millisecond, stopwatch.ElapsedMilliseconds))` then the stopwatch value changes at the exact same interval as the DateTime value. Maybe it's `Console.WriteLine` that's doing a not-so-good job rather than DateTime. – Martin Jul 19 '18 at 07:22

7 Answers

210

Why would DateTime.Now be made less precise than what most CPU clocks could handle?

A good clock should be both precise and accurate; those are different things. As the old joke goes, a stopped clock is exactly accurate twice a day, while a clock that is a minute slow is never accurate at any time. But the clock that is a minute slow is always precise to the nearest minute, whereas a stopped clock has no useful precision at all.

Why should DateTime be precise to, say, a microsecond when it cannot possibly be accurate to the microsecond? Most people do not have any source of official time signals that is accurate to the microsecond. Therefore, giving six digits after the decimal place of precision, the last five of which are garbage, would be lying.

Remember, the purpose of DateTime is to represent a date and time. High-precision timing is not at all the purpose of DateTime; as you note, that's the purpose of Stopwatch. The purpose of DateTime is to represent a date and time for purposes like displaying the current time to the user, computing the number of days until next Tuesday, and so on.

In short, "what time is it?" and "how long did that take?" are completely different questions; don't use a tool designed to answer one question to answer the other.

Thanks for the question; this will make a good blog article! :-)

Eric Lippert
  • 2
    @Eric Lippert: Raymond Chen has an oldie but goodie on the very topic of the difference between "precision" and "accuracy": http://blogs.msdn.com/oldnewthing/archive/2005/09/02/459952.aspx – jason Jan 27 '10 at 00:40
  • 3
    Ok, good point about precision vs. accuracy. I guess I still don't really buy the statement that DateTime is not accurate because "it doesn't have to be." If I have a transactional system, and I want to mark a datetime for each record, to me it seems intuitive to use the DateTime class, but it seems that there are more accurate/precise time components in .NET, so why would DateTime be made less capable. I guess I'll have to do some more reading... – Andy White Jan 27 '10 at 16:32
  • 12
    OK @Andy, suppose you do have such a system. On one machine you mark a transaction as occurring at January 1st, 12:34:30.23498273. On another machine in your cluster you mark a transaction as occurring at January 1st, 12:34:30.23498456. Which transaction occurred first? Unless you know that the two machines clocks are synchronized to within a microsecond of each other, you have no idea which one occurred first. The extra precision is *misleading garbage*. If I had my way, all DateTimes would be rounded to the nearest second, as they were in VBScript. – Eric Lippert Jan 27 '10 at 16:52
  • 15
    not that this would resolve the problem you mention, since average unsynchronized PCs are usually out by *minutes*. Now if rounding to 1s doesn't solve anything then why round at all? In other words, I don't follow your argument for why the absolute values should have a smaller precision than the accuracy of delta time measurements. – Roman Starkov Feb 10 '11 at 22:23
  • 3
    Let's say I'm creating an activity log that requires (1) knowing when something occurred in terms of calendar space (within a few seconds) (2) knowing very exactly the spacing between events (within 50 or so milliseconds). It sounds like the safest bet for this would be to use DateTime.Now for the timestamp of the first action, then use a Stopwatch for subsequent actions to determine the offset from the initial DateTime. Is this the approach you would advise, Eric? – devuxer Aug 06 '13 at 17:38
  • This helped me understand the best the difference between precision and accuracy: [Explanation with a deviation graph](http://upload.wikimedia.org/wikipedia/commons/thumb/3/38/Accuracy_and_precision.svg/300px-Accuracy_and_precision.svg.png) – msysmilu Nov 11 '14 at 10:43
  • 1
    The precision of the DateTime struct and the accuracy of DateTime.Now are two separate issues. I don't think you can assume that every DateTime is generated by DateTime.Now. In my application I find the 100ns tick resolution of the DateTime struct very useful. – Mark Sep 04 '15 at 00:42
  • 1
    One reason that DateTime may be less accurate than some other available time sources is that DateTime has a much bigger range. If you are writing software that deals with very old dates, or dates in the far future, then DateTime works really well for this. You can't expect one solution to cover ten thousand years of elapsed time and also be accurate to the nearest millisecond. – bikeman868 Nov 06 '16 at 06:07
  • 2
    You know, one of the many things I miss about my old Amiga is the way it handled system times. Yes, the system clock had a resolution that was considerably more coarsely grained than the values returned, but the values returned were guaranteed to be unique and monotonically increasing. The system clock would return a time to an 1/18th of a second, but if you called for the system time multiple times in 1/18th of a second, you'd get the system clock +0, system clock +1, system clock +2, etc. – Jeff Dege Aug 20 '18 at 22:38
  • @JeffDege: I did very little systems-level programming on my Amiga; that is an interesting property. Thanks for the note. – Eric Lippert Aug 20 '18 at 22:39
  • @JeffDege: You could make a DateTime generator that has the same property as the Amiga's system times because internally DateTime stores ticks. If DateTime.Now returns the same as the last call to DateTime.Now then add a tick and return that instead. Someone calling it in a tight loop could force a little drift... – Oliver Bock May 11 '19 at 11:06
  • @EricLippert - Oldie but goodie... You are 100% correct when going across machines. But on a single machine, it is a different story... I needed to deal with times of insertion into a (concurrent) queue. Strict ordering was ensured... I needed a true DateTime, which could have a margin of error in the 100 ms range, but I also needed interval timing on the order of microseconds... Ended up having to use two distinct sources, as there was not one element that could match both requirements. – David V. Corbin Jun 04 '21 at 14:39
19

DateTime's precision is somewhat specific to the system it's being run on. The precision is related to the speed of a context switch, which tends to be around 15 or 16 ms. (On my system, it is actually about 14 ms from my testing, but I've seen some laptops where it's closer to 35-40 ms accuracy.)

Peter Bromberg wrote an article on high precision code timing in C#, which discusses this.
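
If you want to see the effective granularity on your own machine, a quick, unscientific sketch is to spin on DateTime.UtcNow and record how large the jumps between distinct reported values are:

using System;
using System.Collections.Generic;

class ClockGranularity
{
    static void Main()
    {
        var deltas = new List<double>();
        long last = DateTime.UtcNow.Ticks;

        // Busy-wait until the reported time changes, and record each jump in milliseconds.
        while (deltas.Count < 20)
        {
            long current = DateTime.UtcNow.Ticks;
            if (current != last)
            {
                deltas.Add((current - last) / (double)TimeSpan.TicksPerMillisecond);
                last = current;
            }
        }

        // Typically prints values clustered around the system timer interval
        // (often ~1 ms or ~15.6 ms, depending on the OS and timer settings).
        Console.WriteLine(string.Join(", ", deltas));
    }
}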

Reed Copsey
  • 2
    The 4 Win7 machines that I've had over the years have all had roughly a 1ms accuracy. Now() sleep(1) Now() always resulted in a ~1ms change in datetime when I was testing. – Bengie May 14 '15 at 20:57
12

I would like a precise DateTime.Now :), so I cooked this up:

using System;
using System.Diagnostics;
using Microsoft.Win32;   // for SystemEvents.TimeChanged

public class PreciseDatetime
{
    // Using DateTime.Now resulted in many log events with the same timestamp.
    // Use static variables in case there are many instances of this class in use in the same program
    // (that way they will all be in sync).
    private static readonly Stopwatch myStopwatch = new Stopwatch();
    private static DateTime myStopwatchStartTime;

    static PreciseDatetime()
    {
        Reset();

        try
        {
            // In case the system clock gets updated
            SystemEvents.TimeChanged += SystemEvents_TimeChanged;
        }
        catch (Exception)
        {
            // Ignore failures to subscribe; timestamps still work,
            // they just won't auto-reset when the system clock changes.
        }
    }

    static void SystemEvents_TimeChanged(object sender, EventArgs e)
    {
        Reset();
    }

    // SystemEvents.TimeChanged can be slow to fire (~3 seconds), so allow forcing a reset.
    public static void Reset()
    {
        myStopwatchStartTime = DateTime.Now;
        myStopwatch.Restart();
    }

    public DateTime Now { get { return myStopwatchStartTime.Add(myStopwatch.Elapsed); } }
}
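
For example, the class can be consumed like this (a hypothetical logging loop, assuming PreciseDatetime is compiled into the same project):

var clock = new PreciseDatetime();

for (int i = 0; i < 5; i++)
{
    // Unlike raw DateTime.Now, consecutive reads are offset by the Stopwatch,
    // so the fractional seconds typically differ between iterations.
    Console.WriteLine("{0:HH:mm:ss.fffffff}  event {1}", clock.Now, i);
}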
Jimmy
  • 2
    I like this solution, but I wasn't sure, so I asked my own question (http://stackoverflow.com/q/18257987/270348). As per the comment/answer from Servy, you shouldn't ever reset the stopwatch. – RobSiklos Aug 15 '13 at 17:38
  • 1
    Maybe not in your context, but resetting makes sense in my context -- I just make sure it's done before timing actually begins. – Jimmy Aug 15 '13 at 22:54
  • You do not need the subscription and you do not need to reset stopwatch. Running this code every ~10ms is not necessary and consumes CPU. And this code is not thread-safe at all. Just initialize myStopwatchStartTime = DateTime.UtcNow; once, in static constructor. – VeganHunter May 08 '18 at 07:06
  • 1
    @VeganHunter I'm not sure if I'm understanding your comment right, but you seem to think that TimeChanged gets called every ~10ms? It doesn't. – Jimmy May 08 '18 at 13:07
  • @Jimmy, you are right. My bad, I misunderstood the code. SystemEvents.TimeChanged event is called only if user changes system time. It is a rare event. – VeganHunter May 08 '18 at 23:49
  • Does anyone use this code in (near-)production software? How does the Now property behave in terms of performance (memory/cpu usage)? Is the .Add operation significant? – thomasgalliker Apr 28 '22 at 04:35
  • @thomasgalliker I didn't measure it, but I doubt that adding a TimeSpan to a DateTime would be a lengthy operation. – Jimmy Apr 28 '22 at 20:08
  • Sadly I can't use this code since SystemEvents.TimeChanged is not available in netstandard2.0 – thomasgalliker Apr 29 '22 at 06:48
7

For what it's worth, short of actually checking the .NET source, Eric Lippert provided a comment on this SO question saying that DateTime is only accurate to approximately 30 ms. His reasoning for it not being nanosecond-accurate, in his words, is that it "doesn't need to be."

Scott Arrington
  • 4
    And it could be worse. In VBScript the Now() function rounds the returned result to the nearest second, despite the fact that the value returned has sufficient available precision to be precise to the microsecond. In C#, the structure is called DateTime; it's intended to represent a date and a time for typical real-world non-scientific domains, like when your life insurance expires or how long it's been since your last reboot. It's not intended for high-precision sub-second timing. – Eric Lippert Jan 27 '10 at 00:27
6

From MSDN you'll find that DateTime.Now has an approximate resolution of 10 milliseconds on all NT operating systems.

The actual precision is hardware dependent. Better precision can be obtained using QueryPerformanceCounter.
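
From managed code you usually don't need to P/Invoke QueryPerformanceCounter yourself; System.Diagnostics.Stopwatch exposes the same high-resolution counter. A minimal sketch:

using System;
using System.Diagnostics;

class HighResolutionTiming
{
    static void Main()
    {
        // Stopwatch is backed by QueryPerformanceCounter when a
        // high-resolution performance counter is available.
        Console.WriteLine("IsHighResolution: {0}, Frequency: {1} ticks/s",
            Stopwatch.IsHighResolution, Stopwatch.Frequency);

        long start = Stopwatch.GetTimestamp();
        // ... work to be measured ...
        long end = Stopwatch.GetTimestamp();

        double elapsedMicroseconds = (end - start) * 1000000.0 / Stopwatch.Frequency;
        Console.WriteLine("Elapsed: {0:F1} microseconds", elapsedMicroseconds);
    }
}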

Kevin Montrose
3

From MSDN documentation:

The resolution of this property depends on the system timer.

They also claim that the approximate resolution on Windows NT 3.5 and later is 10 ms :)

Tomas Vana
2

The resolution of this property depends on the system timer, which depends on the underlying operating system. It tends to be between 0.5 and 15 milliseconds.

As a result, repeated calls to the Now property in a short time interval, such as in a loop, may return the same value.

MSDN Link

Iman