
I have this code:

var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    if (currentDate - date >= 1000) {
        console.log(currentDate, date);
        console.log(currentDate - date);
    }
    else {
        console.log("It was less than a second!");
        console.log(currentDate - date);
    }
}, 1000);

On my computer, it always executes correctly, with 1000 in the console output. Interestingly, on another computer, the same code fires the timeout callback in less than a second, and the difference currentDate - date is between 980 and 998.

I know there are libraries that address this inaccuracy (for example, Tock).

Basically, my question is: what are the reasons why setTimeout does not fire after exactly the given delay? Could it be that the computer is too slow, and the browser automatically tries to adapt to the slowness and fires the event early?

PS: The code and results were executed in the Chrome JavaScript console (screenshot not included here).

Tomás
  • A difference of 1-2 milliseconds (over 1000 milliseconds) is probably attributable to jitter. Why do you think that is inaccurate? It is 1/5 of one percent. – Elliott Frisch Jan 13 '14 at 17:28
  • John Resig wrote a good article on this :) http://ejohn.org/blog/accuracy-of-javascript-time/ – rorypicko Jan 13 '14 at 17:28
  • @ElliottFrisch I think it is inaccurate because if I say "1 second", I want exactly "1 second". No more, no less. – Tomás Jan 13 '14 at 17:30
  • @Tomás Digital electronics have [jitter](http://en.wikipedia.org/wiki/Jitter). – Elliott Frisch Jan 13 '14 at 17:31
  • @Tomás This would depend on how accurate you want it to be. To say you want it to be "exactly" 1 second is not realistic. The 999 ms you got is accurate to the hundredth decimal, but maybe not the thousandth. There will always be some degree of inaccuracy. – Zhihao Jan 13 '14 at 17:33
  • Because a browser isn't a real-time system. – Dave Newton Jan 13 '14 at 17:43
  • Interesting note... I was able to reproduce the _less than specified duration_ on Chrome `31.0.1650.63 m`. However, on `34.0.1782.2 canary` and every other browser I tried, I received `>= 1000` (as expected). – canon Jan 13 '14 at 17:45

5 Answers

It's not supposed to be particularly accurate. There are a number of factors limiting how soon the browser can execute the code; quoting from MDN:

In addition to "clamping", the timeout can also fire later when the page (or the OS/browser itself) is busy with other tasks.

In other words, the way setTimeout is usually implemented, it is only meant to execute after at least the given delay, once the browser's thread is free to execute it.
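You can also see the "clamping" part for yourself with nested zero-delay timeouts; most browsers enforce a minimum delay of about 4 ms once the nesting gets a few levels deep. A rough sketch (exact numbers vary by browser):

var last = performance.now();
var count = 0;

(function tick() {
    var now = performance.now();
    console.log('Iteration ' + count + ': ' + (now - last).toFixed(2) + ' ms since the previous call');
    last = now;
    if (++count < 10) {
        setTimeout(tick, 0); // nested zero-delay timeouts get clamped to ~4 ms after a few levels
    }
})();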

However, different browsers may implement it in different ways. Here are some tests I did:

var date = new Date();
setTimeout(function(e) {
    var currentDate = new Date();
    console.log(currentDate-date);
}, 1000);

// Browser Test1 Test2 Test3 Test4
// Chrome    998  1014   998   998
// Firefox  1000  1001  1047  1000
// IE 11    1006  1013  1007  1005

Perhaps the < 1000 times from Chrome could be attributed to inaccuracy in the Date type, or perhaps it could be that Chrome uses a different strategy for deciding when to execute the code—maybe it's trying to fit it into the nearest time slot, even if the timeout delay hasn't quite completed yet.

In short, you shouldn't use setTimeout if you expect reliable, consistent, millisecond-scale timing.
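If what you actually need is to know how much time really passed, measure it rather than trusting the delay. A rough sketch using performance.now(), which is monotonic and has sub-millisecond resolution in most browsers:

var start = performance.now();

setTimeout(function () {
    // Measure the real elapsed time instead of assuming the 1000 ms was honored exactly.
    var elapsed = performance.now() - start;
    console.log('Requested 1000 ms, actually waited ' + elapsed.toFixed(2) + ' ms');
}, 1000);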

p.s.w.g
  • In short, you shouldn't use `JavaScript` if you expect reliable, consistent, millisecond-scale anything. There really isn't any other alternative in the language. – Josh Hibschman Mar 04 '22 at 01:50

In general, computer programs are highly unreliable when trying to execute things with higher precision than 50 ms. The reason for this is that even on an octacore hyperthreaded processor the OS is usually juggling several hundreds of processes and threads, sometimes thousands or more. The OS makes all that multitasking work by scheduling all of them to get a slice of CPU time one after another, meaning they get 'a few milliseconds of time at most to do their thing'.

Implicitly this means that if you set a timeout for 1000 ms, chances are far from small that the current browser process won't even be running at that point in time, so it's perfectly normal for the browser not to notice until 1005, 1010 or even 1050 milliseconds that it should be executing the given callback.
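You can see the "won't even be running" effect directly by keeping the browser's main thread busy past the deadline; a rough sketch:

var start = Date.now();

setTimeout(function () {
    // Requested 100 ms, but this cannot run until the busy loop below has finished.
    console.log('Fired after ' + (Date.now() - start) + ' ms');
}, 100);

// Keep the main thread busy for roughly 500 ms; the timeout is due halfway through,
// but the callback has to wait its turn in the event loop.
while (Date.now() - start < 500) { /* busy-wait */ }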

Usually this is not a problem; it happens, and it's rarely of the utmost importance. If it is, all operating systems supply kernel-level timers that are far more precise than 1 ms and that allow a developer to execute code at precisely the correct point in time. JavaScript, however, as a heavily sandboxed environment, doesn't have access to kernel objects like that, and browsers refrain from using them, since it could theoretically allow someone to attack the OS's stability from inside a web page by carefully constructing code that starves other threads by swamping them with a lot of dangerous timers.

As for why the test yields 980, I'm not sure; that would depend on exactly which browser and JavaScript engine you're using. I can, however, fully understand it if the browser just corrects a bit downwards for system load and/or speed, ensuring that "on average the delay is still about the correct time". It would make a lot of sense, given the sandboxing principle, to just approximate the required amount of time without potentially burdening the rest of the system.

Niels Keurentjes
  • Indeed, the problem is not specific to JavaScript. And with an accuracy of 50 ms or less for a browser executing an interpreted language, I think we're already doing pretty darn well. – Alexis Wilke Dec 16 '15 at 09:10
  • Hrm.. I can achieve pretty reliable nanosecond time precision in Golang and C. Maybe it's more related to the language than general computing? – Josh Hibschman Mar 04 '22 at 01:52
  • @JoshHibschman computers and processors improved a bit in the more than 8 years that passed since the original answer was posted. Still, nanosecond precision is impossible, Golang and C are also not exempt from context switches. Once the kernel pauses your process, you're fucked, in every language. – Niels Keurentjes Jul 22 '22 at 00:38
  • Merely poking a jab here at single-threaded langs, which is the main limitation with context switches, i.e. you can't dictate what happens on the next tick. However, in C/Go/etc. just share a mutex between two threads: one keeps time while the other works. Then if your system can for-loop to a billion in < 1 second, you can implement nanosecond precision. – Josh Hibschman Jul 22 '22 at 16:47

Someone please correct me if I am misinterpreting this information:

According to a post from John Resig regarding the inaccuracy of performance tests across platforms (emphasis mine):

With the system times constantly being rounded down to the last queried time (each about 15 ms apart) the quality of performance results is seriously compromised.

So there is up to a 15 ms fudge on either end when comparing to the system time.
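If you want to check the clock granularity on a particular machine yourself, a rough sketch is to sample Date.now() in a tight loop and record the size of each jump between distinct readings:

var last = Date.now();
var steps = [];

// Collect the first ten increments of the clock. On systems with a coarse timer
// the jumps come in larger chunks (historically around 15 ms on some Windows setups).
while (steps.length < 10) {
    var now = Date.now();
    if (now !== last) {
        steps.push(now - last);
        last = now;
    }
}

console.log('Observed clock increments (ms):', steps);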

Evan Davis
  • That's one aspect of the problem. Another issue is the asynchronous nature of the browser, so depending on whatever else is in the queue, the timeout may be put on hold to deal with other jobs. – EmptyArsenal Jan 13 '14 at 17:32

I had a similar experience.
I was using something like this:

// CountDownClock is the answerer's own function that redraws the on-screen clock.
var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000));
setTimeout(function ()
{
    CountDownClock(ElementID, RelativeTime);
}, iMillSecondsTillNextWholeSecond); // Wait until the next whole second to start.

I noticed it would skip a second every couple of seconds, and sometimes it would go longer.
However, I'd still catch it skipping after 10 or 20 seconds, and it just looked rickety.
I thought, "Maybe the timeout is too slow or waiting for something else?"
Then I realized, "Maybe it's too fast, and the timers the browser is managing are off by a few milliseconds?"

After adding +1 millisecond to my variable, I only saw it skip once.
I ended up adding +50 ms, just to be on the safe side.

var iMillSecondsTillNextWholeSecond = (1000 - (new Date().getTime() % 1000) + 50);

I know it's a bit hacky, but my timer is running smoothly now. :)
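An alternative to a fixed +50 ms fudge is to re-aim at the next whole-second boundary before every tick, so an early or late firing never accumulates. A rough sketch (startWholeSecondTimer and the tick callback are just placeholder names, not from the answer above):

function startWholeSecondTimer(tick) {
    // The next whole-second boundary we want to fire on (ms since the epoch).
    var target = Math.ceil(Date.now() / 1000) * 1000;

    function scheduleNext() {
        setTimeout(function () {
            if (Date.now() < target) {
                // The browser fired us slightly early; wait for the boundary to actually pass.
                scheduleNext();
                return;
            }
            tick(new Date(target));
            target += 1000;   // aim for the next boundary, so errors never accumulate
            scheduleNext();
        }, Math.max(0, target - Date.now()));
    }

    scheduleNext();
}

startWholeSecondTimer(function (now) {
    console.log('Tick at', now.toLocaleTimeString());
});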

MikeTeeVee

JavaScript has a way of dealing with this. Here's one approach:

You could save Date.now() when you start waiting, create an interval with a short update period, and check the difference between the two dates on every tick.

Example:

const startDate = Date.now()

const interval = setInterval(() => {
  const currentDate = Date.now()

  // Use >= rather than ===: with a 50 ms interval the elapsed time will
  // almost never land on exactly 1000 ms.
  if (currentDate - startDate >= 1000) {
    // at least a second has passed

    clearInterval(interval)
    return
  }

  // less than a second has passed so far
}, 50)