
PLEASE NOTE: The linked-to question does NOT answer my question. Bergi's answer is the correct answer to what I am asking.

To start, please take a look at this simple CountdownTimer:

http://jsfiddle.net/49tH7/ (see below for permanent code)

Make sure to open your console so you can see it counting down. I tested using the latest version of Chrome.

Explanation of Code

Note the way we call the timer:

timer = CountdownTimer(5000, 5, function(tickNum) { console.log(tickNum) });
timer.start();

We pass in the total time the timer should take to count down (5000 ms), the total number of ticks it should complete (in this case 5, i.e. one per second), and a callback function to be invoked on each tick. The callback is passed the number of the current tick, and in this case it just prints that number. When the timer finishes, the onFinish() method is called, which simply prints the total number of milliseconds the timer has just run. In theory this should always be very close to the first argument passed to the CountdownTimer constructor, in this case 5000.
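For the five-tick call above, a run would print something roughly like this (the last line is the elapsed-time report from onFinish(); the exact value varies from run to run but should sit near 5000):

1
2
3
4
5
~5000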

The time between ticks is meant to be adaptive, taking into account the time taken to execute the callback. If the callback code is slow on average, the optimalPauseTime() method will detect that and start shortening the time between ticks, so that the timer still completes within the total time specified in the first argument to CountdownTimer. Hence the total running time of the timer and the total number of ticks are fixed quantities, but the time between ticks is dynamic and variable.
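To make the pacing arithmetic concrete, here is a standalone sketch of the same calculation optimalPauseTime() performs, with hypothetical numbers that are not taken from the fiddle:

// Hypothetical snapshot (values invented for illustration): a 5000 ms / 5-tick
// timer whose first two ticks ran slow callbacks, so 2300 ms have already
// elapsed instead of the ideal 2000 ms.
var totalMs = 5000, totalTicks = 5;
var ticksCompleted = 2;
var timeElapsed = 2300;

var ticksRemaining = totalTicks - ticksCompleted;  // 3
var timeRemaining = totalMs - timeElapsed;         // 2700 ms
var pause = timeRemaining / ticksRemaining;        // 900 ms instead of the nominal 1000 ms

console.log(pause); // the 300 ms of accumulated lag is spread evenly over the 3 remaining ticks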

Strange Behavior and My Questions

To see the strangeness motivating this post, crank up the number of ticks in the fiddle to 1000 and break out your phone's stopwatch:

timer = CountdownTimer(5000, 1000, function(tickNum) { console.log(tickNum) });
timer.start();

Start your stopwatch as soon as you press Run, and stop it as soon as the console output ends. Obviously this is a rough estimate that might be off by a second, but it's good enough to see what's going on. The final output of the log looks like this:

999 (index):54
1000 (index):54
5004 

and my stopwatch reads 8.5 seconds. I did the test a number of times with similar results.

Question 1 How is it possible that the JavaScript code is claiming that it took only 5004 ms to run, when it actually took a few seconds longer than that?

Question 2 To see an exaggerated version of the problem and watch your CPU shoot up, crank the ticks up to 2000 and run it again. In this case, it took about 25 seconds to run by my stopwatch, and the final number output was 8727. So now the code is able to detect that it's running more slowly, but it still underestimates that slowness by a lot. What is going on that explains all this?

JSFiddle Code

function CountdownTimer(totalMs, totalTicks, tickCb) {

    var ret = {}, startTime, ticksCompleted;

    // Run one tick, then schedule the next one after an adaptive pause.
    function tick() {
      if (ticksCompleted == totalTicks) { onFinish(); return; }
      ticksCompleted++;
      tickCb(ticksCompleted);
      setTimeout(function() { tick(); }, optimalPauseTime());
    }

    // Spread whatever time remains evenly over the remaining ticks.
    function optimalPauseTime() {
      var timeElapsed = new Date() - startTime;
      var avgTickTimeSoFar = timeElapsed / ticksCompleted; // not used below
      var ticksRemaining = totalTicks - ticksCompleted;
      var timeRemaining = totalMs - timeElapsed;
      return timeRemaining / ticksRemaining;
    }

    function start() {
      startTime = new Date();
      ticksCompleted = 0;
      tick();
    }

    // Report how many ms the timer actually ran.
    function onFinish() {
      console.log(new Date() - startTime);
    }

    ret.start = start;
    return ret;
}

timer = CountdownTimer(5000, 5, function(tickNum) { console.log(tickNum) });
timer.start();
Jonah
  • possible duplicate of [Is there a more accurate way to create a Javascript timer than setTimeout?](http://stackoverflow.com/questions/196027/is-there-a-more-accurate-way-to-create-a-javascript-timer-than-settimeout) – Quentin Mar 06 '14 at 17:22
  • This is really not a duplicate of the linked question, especially if you read Bergi's answer. I spent about 30 minutes writing this question and making sure it was very clear... voting to close without carefully reading is not fair or helpful to the site. – Jonah Mar 06 '14 at 18:03

1 Answer


How is it possible that the JavaScript code is claiming that it took only 5004 ms to run, when it actually took a few seconds longer than that?

I cannot reproduce this. Maybe your stopwatch (or the process that triggers it?) is inaccurate; new Date usually works quite well.

You may also just be experiencing the console's display delay: the console is not repainted constantly, so that the page itself keeps running smoothly. Displaying lots of messages in the console can be a bottleneck (writing them to the console buffer should not be).
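One way to take console rendering out of the measurement (and what Jonah ends up doing in the comments below) is to have the callback write into a DOM element instead of calling console.log. A minimal sketch, assuming the page contains a div with id "out" (that id is not part of the original fiddle):

// Assumes markup like <div id="out"></div> exists on the page.
var out = document.getElementById('out');

timer = CountdownTimer(5000, 1000, function(tickNum) {
  out.textContent = tickNum;  // overwrite one element instead of logging 1000 lines
});
timer.start();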

To see an exaggerated version of the problem and watch your CPU shoot up, crank the ticks up to 2000 and run it again. In this case, it took about 25 seconds to run by my stopwatch, and the final number output was 8727. So now the code is able to detect that it's running more slowly, but it still underestimates that slowness by a lot. What is going on that explains all this?

There's a minimum timeout implemented in browsers (nested setTimeout calls are clamped to roughly 4 ms). Having 2000 ticks in 5 s expects a tick every 2.5 ms, which is below that minimum; a browser just cannot schedule ticks that fast.

Also, the onFinish() call is always deferred one further tick, i.e. it's usually about 4ms after the actual end.
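You can observe that clamp directly with a quick sketch (not part of the original fiddle; the exact numbers depend on the browser and on load):

// Chain 100 nested setTimeout(..., 0) calls and time them.
var n = 0, t0 = Date.now();
(function next() {
  if (++n === 100) {
    // With a ~4 ms minimum on nested timeouts, expect roughly 400 ms, not ~0 ms.
    console.log('100 nested zero-delay timeouts took ' + (Date.now() - t0) + ' ms');
    return;
  }
  setTimeout(next, 0);
})();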

Also, your optimalPauseTime does not aim for exactness, but rather for an equal distribution of the remaining timeouts. When it is running late and tries to catch up, it doesn't catch up immediately; it distributes the catch-up over the remaining ticks. Instead, you might do something like:

function tick() {
  ticksCompleted++;
  tickCb(ticksCompleted);
  if (ticksCompleted == totalTicks) {
    onFinish();
  } else {
    var pause = optimalPauseTime();
    if (pause <= 0)
      tick();               // already behind schedule, so fire the next tick immediately
    else
      setTimeout(tick, pause);
  }
}

// Schedule each tick against its absolute target time instead of
// spreading any accumulated lag over the remaining ticks.
function optimalPauseTime() {
  var timeElapsed = new Date() - startTime;
  var nextTickRatio = (ticksCompleted + 1) / totalTicks;
  var expectedNextTime = totalMs * nextTickRatio;
  return expectedNextTime - timeElapsed;
}

(updated demo)

Bergi
  • This is a great answer, and I think the "browser just cannot schedule ticks that fast" gets to the heart of my questions. One thing I did notice in your updated demo: it runs even slower, and 1000 ticks took over 20 seconds for me (vs the 8 with my original). I assume this is because the logic in your optimal pause method is more intensive? – Jonah Mar 06 '14 at 18:06
  • Uh, what? When I'm testing in Chrome, I'm getting `5000±3` always. No, my `optimalPause` method is definitely not intensive. What could be intensive is displaying >2000 lines in the console… do you measure the time that the console needs to update? Try to remove the log statement from the ticks. – Bergi Mar 06 '14 at 18:14
  • Ok, you are right. The bottleneck is writing that much to chrome console. When I update a div instead it works great: http://jsfiddle.net/vdY7M/. So I actually think this console bottleneck is the other piece of the explanation to my original questions... you might want to add to your answer. – Jonah Mar 06 '14 at 18:23