If you create a very simple program that has a setInterval with a 1-second delay, and you log the times its function is called, you will notice that the interval 'drifts'.
Basically, it takes (1,000ms + some extra amount of time) between each call.
For this program, it's ~1,005ms between each call.
What causes the drift?
Is it taking 5ms to requeue setInterval?
Is it the time it takes to run the callback itself? (I doubt this, but I'm having trouble ruling it out.)
Why does setInterval behave this way instead of basing itself on some clock time? (e.g. if you have a 1,000ms delay and you started at time 3, just check whether 1,003 has elapsed, then 2,003, and so on - a rough sketch of what I mean is at the bottom of this post.)
Example:
const startTime = new Date().valueOf();

// Logs the milliseconds elapsed since startTime on every tick.
function printElapsedTime(startTime) {
  console.log(new Date().valueOf() - startTime);
}

// startTime is passed through as the argument to printElapsedTime.
let intervalObj = setInterval(printElapsedTime, 1000, startTime);
Output: 1005 2010 3015 4020
So you are no longer sync'd to 1 second. Since it drifts by about 5ms per call, after 100 calls it will be running about half a second 'later' than expected.
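A variant that logs the gap between consecutive calls (instead of the total elapsed time) shows the same slippage on every single tick. This is just an illustrative rewrite, and prevTime is a name I made up:

let prevTime = new Date().valueOf();

setInterval(() => {
  const now = new Date().valueOf();
  console.log(now - prevTime); // prints ~1005 each time for me, never exactly 1000
  prevTime = now;
}, 1000);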
This question discusses how to avoid this drift, but does not explain WHY the drift is happening. (As in, it does not say that setInterval re-queues its callback on the event queue after each call and that the re-queuing itself takes a few ms - which is just my guess at the cause of the drift.)
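For reference, this untested sketch is roughly what I mean by basing it on clock time: each tick is scheduled with setTimeout against the original start time, so the few ms of lag per call never accumulate (tick, n, and nextAt are names I made up for the sketch):

const start = new Date().valueOf();
let n = 1; // number of the tick that is about to run

function tick() {
  console.log(new Date().valueOf() - start); // stays near n * 1000 instead of drifting away
  n += 1;
  const nextAt = start + n * 1000; // absolute time the next tick should fire
  setTimeout(tick, Math.max(0, nextAt - new Date().valueOf()));
}

setTimeout(tick, 1000); // first tick at start + 1000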