
I found a function to create a self-correcting interval in JS, since setInterval cannot be trusted to be accurate. When logging out nextTick (the drift fixer-upper, to be technical) with an interval of 1 second, 1/100th of a second, or 1/10th of a second, it prints as expected.

Instead of the next tick being 1000 milliseconds, it's usually 999, sometimes 998. For the other intervals mentioned, you see values such as 99 or 98 for 100ths of a second, and usually 9 for 10ths of a second.

Now, when it comes to milliseconds, that is, creating an interval that is supposed to fire every millisecond, nextTick prints NaN or negative numbers. I suppose this is just because the interval is so small. I am wondering if it's even possible to get accuracy down to 1 ms.
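To make the failure mode concrete, the drift correction can be reduced to a pure function (the name `computeNextTick` is mine, not from the code below). With a 1-second interval, a small wake-up lag just shortens the next delay; with a 1ms interval, the lag routinely exceeds the interval itself, so the result goes negative (and setTimeout clamps negative delays to its minimum):

```javascript
// Hypothetical helper mirroring the drift correction inside Interval.run() below.
function computeNextTick(interval, startTime, currentTime, ticks) {
  // This tick was due at startTime + ticks * interval; subtract the
  // lateness (currentTime - due time) from the interval to get the next delay.
  return interval - (currentTime - (startTime + ticks * interval));
}

// 1000ms interval, tick 3 fires 2ms late: next delay is 998ms
console.log(computeNextTick(1000, 0, 3002, 3)); // 998

// 1ms interval, tick 3 fires 5ms after it was due: next delay is -4ms,
// which setTimeout silently treats as its minimum delay
console.log(computeNextTick(1, 0, 8, 3)); // -4
```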

class Interval {

    constructor(interval, onTick){
        this.interval = interval;
        this.onTick = onTick || function(){};
        this.timer = false;
        this.ticks = 0;
        this.startTime = 0;
        this.currentTime = 0;
        this.elapsedTime = 0;
        return this;
    }

    run(){
        this.currentTime = Date.now();
        if(!this.startTime){
            this.startTime = this.currentTime;
        }

        this.onTick();

        // This tick was due at startTime + ticks * interval; subtract the
        // lateness from the interval so the next delay compensates for drift.
        let nextTick = this.interval - (this.currentTime - (this.startTime + (this.ticks * this.interval)));
        //console.log(nextTick);
        this.ticks++;

        let self = this;
        this.timer = setTimeout(function(){
            self.run();
        }, nextTick);

        return this;
    }

    start(){
        this.run();
        return this;
    }

    stop(){
        clearTimeout(this.timer);
        return this;
    }
}

Original

Fiddle

In the fiddle you can uncomment the interval constructors one by one to see what I mean.

jozenbasin
  • 3
    No, it's not possible to have such highly accurate timers in JavaScript, or any non-realtime system. Things that need this sort of accuracy (such as audio timing) have to schedule events. Also, check this out to see if this is helpful: https://stackoverflow.com/questions/20341008/deriving-a-more-accurate-clock-from-an-inaccurate-clock – Brad Oct 25 '17 at 02:21
  • 1
    If you just want a non-drifting clock, see [How to create an accurate timer in javascript?](https://stackoverflow.com/a/29972322/1048572) If you want a 1ms interval, that's hardly possible without a realtime OS. – Bergi Oct 25 '17 at 02:23
  • 2
    Apart from what is said in the previous comment, use performance.now() instead of Date.now() for high precision timers –  Oct 25 '17 at 02:23
  • 1
    @jozenbasin Sorry I missed your question "I am wondering if its even possible to get accuracy down to 1 ms", you are getting plenty of feedback from others on it. I've retracted my close vote, and will retract my down vote right now – Dexygen Oct 25 '17 at 02:31
  • 2
    mentioned in the documentation https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/setTimeout#Reasons_for_delays_longer_than_specified. Not sure if related, but on rare occasions `Date.now() - Date.now()` can return -1 – Slai Oct 25 '17 at 02:33
  • [JS doesn't really support timeouts under 4ms](https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/setTimeout#Reasons_for_delays_longer_than_specified). – nnnnnn Oct 25 '17 at 03:14
  • 1
    It turns out that this depends more on the OS than on the language. I haven't tested js in particular but in Tcl (which has an event loop mechanism similar to js) I can get 1ms resolution timers on Windows but I usually get around 5 or 10ms resolution timers on Linux. Windows, due to it's huge games market and dev support, is designed more like a real-time OS while Linux, due to it's huge web server market and dev support, is designed more like a highly parallel batch processing system optimized for throughput. – slebetman Oct 25 '17 at 04:08
  • @slebetman +1 for Tcl – Dexygen Oct 25 '17 at 09:46

0 Answers