I've been seeing some strange behavior in a web app I've been working on: the app doesn't function as expected when the user switches tabs. After some googling, I found the comments section on this question: How do browsers pause/change Javascript when tab or window is not active?, which indicated that timeouts under 1000ms are clamped up to 1000ms when the page loses focus. This would explain a lot of the quirks I've seen in my web app, where console.time was inexplicably reading out ~1000ms. Searching Google, I haven't been able to find any articles detailing this behavior, and I haven't found a way to prevent it. Does anyone happen to know where I could find more info, or how I might prevent Chrome from increasing my timeouts on page blur?
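For reference, here's roughly how I'm observing the clamping (a minimal sketch; the 100ms delay is just an arbitrary example). While the tab is focused the logged deltas stay near 100ms; after switching tabs they jump to about 1000ms:

```javascript
// Schedule a 100ms timeout in a loop and log how long each tick actually took.
let last = performance.now();

function tick() {
  const now = performance.now();
  console.log(`tick after ${Math.round(now - last)}ms`);
  last = now;
  setTimeout(tick, 100); // requested 100ms; clamped to ~1000ms in a background tab
}

setTimeout(tick, 100);
```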
EDIT: To answer the useless "Why?" response: it's a WebRTC app I've been working on as a novelty. If certain timers on my page stop firing on schedule, the app is no longer a web-based real-time communication app; it's a web-based delayed communication app.
Thanks to apsillers for finding the solution to this problem. This link, setTimeout/setInterval 1000ms lag in background tabs (Chrome and Firefox), details how web workers can be used to circumvent this behavior; a rough sketch is below.
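The workaround boils down to something like this (a minimal sketch based on that answer, not production code; building the worker from a Blob is only to keep the example self-contained). Timers inside a dedicated worker aren't clamped when the tab loses focus, so the worker keeps the beat and just posts a message back to the page:

```javascript
// Worker source: a plain setInterval that pings the main thread.
const workerSource = `
  setInterval(function () {
    postMessage('tick'); // keeps firing every 100ms even in a background tab
  }, 100);
`;

// Build the worker from a Blob so no separate file is needed.
const blob = new Blob([workerSource], { type: 'application/javascript' });
const worker = new Worker(URL.createObjectURL(blob));

worker.onmessage = function () {
  // Do the time-sensitive work here on the main thread.
  console.log('tick at', Math.round(performance.now()), 'ms');
};
```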