I have a timer to keep track of the time the user spends doing something. There's a button that, among other things, resets the timer to zero. That works fine.
However, when the user is away (i.e. taking their 15 or 30 minute breaks), the computer must be locked. I've noticed that the timer is inconsistent upon return. Sometimes it appears the timer doesn't move at all until the user returns; sometimes it advances a few seconds, or a few minutes, but it is never the time the user was away - always LESS.
For instance, if I lock the PC now, and the timer is on 5 minutes and 38 seconds, and I take a 15 minute break, I'll typically return to a timer that's roughly where it was when I left, or perhaps it'll be like 6 minutes and 5 seconds.
How can I fix this? Is there any way to ensure setInterval() continues to behave consistently, or am I better off doing something else (such as comparing timestamps instead of an independent counter like I have now)?
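To clarify what I mean by comparing timestamps: instead of incrementing a counter on every tick, I'd store the start time and recompute the elapsed time from the wall clock, so a throttled interval would only delay the display update rather than lose time. A rough sketch of the idea (function names are just placeholders, not my actual code):

```javascript
// Keep the authoritative start time; the reset button would set this.
let startTime = Date.now();

function resetTimer() {
  startTime = Date.now();
}

function elapsedSeconds() {
  // Derived from the wall clock, so time "passes" even while
  // setInterval callbacks are throttled or paused.
  return Math.floor((Date.now() - startTime) / 1000);
}

// The interval would then only be responsible for refreshing the UI:
// setInterval(() => updateDisplay(elapsedSeconds()), 1000);
```

With this approach a late or skipped tick just means the display is momentarily stale; the next tick shows the correct total.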
For what it's worth, it seems even MORE inconsistent now that we are using VDI workstations, though I don't know if that makes a difference (the issue happened on normal workstations as well).
EDIT: I noticed in another question that this may also be a Chrome thing - it apparently throttles timers when a window or tab is not active. Are there workarounds for --app-mode windows?