PLEASE NOTE: The linked-to question does NOT answer my question. Bergi's answer is the correct answer to what I am asking.
To start, please take a look at this simple CountdownTimer:
http://jsfiddle.net/49tH7/ (see below for a permanent copy of the code)
Make sure to open your console so you can see it counting down. I tested using the latest version of Chrome.
Explanation of Code
Note the way we call the timer:
timer = CountdownTimer(5000, 5, function(tickNum) { console.log(tickNum) });
timer.start();
We pass the total time the timer should take to count down (5000 ms), the total number of ticks it should complete (in this case 5, i.e. one per second), and a callback function to be invoked at each tick. The callback is passed the number of the tick, and in this case it just prints that out. When the timer finishes, the onFinish() method is called, which simply prints out the total number of ms the timer has just run. In theory this should always be very close to the first argument passed into the CountdownTimer constructor, in this case 5000.
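With the five-tick call above, the console output looks roughly like this (the final line is the measured total in ms; the exact value varies slightly from run to run):

1
2
3
4
5
5003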
The time between ticks is meant to be adaptive, taking into account the time taken to execute the callback. If the callback code is slow and complex on average, the optimalPauseTime() method will detect that and start adjusting down the time between ticks, so that the timer still completes within the total time specified by the first argument to CountdownTimer. Hence the total running time of the timer and the total number of ticks are fixed quantities, but the time between ticks is dynamic and variable.
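For instance (made-up numbers, purely to illustrate the calculation): with totalMs = 5000 and totalTicks = 5, if slow callbacks mean 2300 ms have already elapsed after 2 ticks, the remaining 2700 ms gets spread over the 3 remaining ticks:

var totalMs = 5000, totalTicks = 5;
var ticksCompleted = 2;
var timeElapsed = 2300;                            // slow callbacks put us behind schedule
var ticksRemaining = totalTicks - ticksCompleted;  // 3
var timeRemaining = totalMs - timeElapsed;         // 2700
console.log(timeRemaining / ticksRemaining);       // 900 -- shorter than the nominal 1000 ms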
Strange Behavior and My Questions
To see the strangeness motivating this post, crank up the number of ticks in the fiddle to 1000 and break out your phone's stopwatch:
timer = CountdownTimer(5000, 1000, function(tickNum) { console.log(tickNum) });
timer.start();
Start your stopwatch as soon as you press Run, and stop it as soon as you see the console output end. Obviously this is a rough estimate that might be off by a second, but it's good enough to see what's going on. The final output of the log looks like this:
999 (index):54
1000 (index):54
5004
and my stopwatch reads 8.5 seconds. I did the test a number of times with similar results.
Question 1: How is it possible that the JavaScript code is claiming it took only 5004 ms to run, when it actually took a few seconds longer than that?
Question 2: To see an exaggerated version of the problem and watch your CPU usage shoot up, crank the ticks up to 2000 and run it again. In this case it took about 25 seconds by my stopwatch, and the final number output was 8727. So now the code is able to detect that it's running more slowly, but it is still underestimating that slowness by a lot. What is going on that explains all this?
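For what it's worth, here is a rough sketch of how one could measure the real wall-clock gap between ticks from inside the callback and compare it against the timer's own accounting (the lastTick bookkeeping is illustrative and not part of the fiddle):

var lastTick = Date.now();
timer = CountdownTimer(5000, 1000, function(tickNum) {
  var now = Date.now();
  // Real wall-clock time since the previous tick, as opposed to the
  // pause the timer asked setTimeout for.
  console.log(tickNum + ' gap: ' + (now - lastTick) + ' ms');
  lastTick = now;
});
timer.start();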
JSFiddle Code
function CountdownTimer(totalMs, totalTicks, tickCb) {
  var ret = {}, startTime, ticksCompleted;

  function tick() {
    // Stop once the requested number of ticks has been delivered.
    if (ticksCompleted === totalTicks) { onFinish(); return; }
    ticksCompleted++;
    tickCb(ticksCompleted);
    setTimeout(function() { tick(); }, optimalPauseTime());
  }

  // Spread whatever time remains evenly over the ticks that remain.
  function optimalPauseTime() {
    var timeElapsed = new Date() - startTime;
    var avgTickTimeSoFar = timeElapsed / ticksCompleted; // computed but currently unused
    var ticksRemaining = totalTicks - ticksCompleted;
    var timeRemaining = totalMs - timeElapsed;
    return timeRemaining / ticksRemaining;
  }

  function start() {
    startTime = new Date();
    ticksCompleted = 0;
    tick();
  }

  function onFinish() {
    // Total elapsed ms as the timer itself sees it.
    console.log(new Date() - startTime);
  }

  ret.start = start;
  return ret;
}

var timer = CountdownTimer(5000, 5, function(tickNum) { console.log(tickNum); });
timer.start();