
So I have this single-page application that does a lot of computation every time the user performs an action like clicking a button. As JavaScript is single-threaded, the lengthy calculation blocks UI updates and creates a bad user experience:

$('#update-btn').click(function () {
  updateDomWithAnimation();
  doLengthyCalc();
});

After reading perhaps one too many articles on how to handle this, I find that wrapping some of the function calls in window.setTimeout does help. So, armed with this knowledge, I have started wrapping them up, and it does seem to bring some responsiveness back to the browser.
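
For reference, here is a minimal sketch of what I mean by wrapping, assuming the heavy work can be split into independent chunks (processChunk and the chunk size are just placeholders for illustration):

// Hypothetical: split the heavy work into chunks and yield to the browser
// between them so clicks, repaints and animations can still be processed.
function doLengthyCalcChunked(items, done) {
    var index = 0;
    var CHUNK_SIZE = 100; // tune so each slice stays well under one frame (~16ms)

    function runChunk() {
        var end = Math.min(index + CHUNK_SIZE, items.length);
        for (; index < end; index++) {
            processChunk(items[index]); // placeholder for the real per-item work
        }
        if (index < items.length) {
            window.setTimeout(runChunk, 0); // yield to the browser, then continue
        } else if (typeof done === 'function') {
            done(); // all chunks finished
        }
    }

    runChunk();
}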

However, my question is: are there any adverse side effects of having too many setTimeout calls, even if the delay is only 0 or 1 ms? From a business-logic perspective I am making sure that only independent, standalone functions are wrapped in setTimeout, but I wanted to check from a technical viewpoint. Can any JS gurus share some insight?

P.S.: I had taken a look at Web Workers, but my code is built using jQuery and depends heavily on DOM state, so implementing Web Workers at the moment would not be possible, which is why I am using timeouts. Much appreciated!

Undefined Variable
  • You might want to look into using the [Web Workers API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers) – charlietfl Jan 21 '16 at 18:47
  • JS runs on the client device, so if your code slows things down it only affects that client... I still can't figure out why you would need such a delay... maybe you can tell us more details about your scenario – ymz Jan 21 '16 at 18:48
  • @charlietfl: thank you for the link, I had taken a look at it, but my code is built using jQuery and depends heavily on DOM state, so implementing Web Workers at the moment would not be possible, which is why I am using timeouts – Undefined Variable Jan 21 '16 at 18:49

1 Answer


While technically it's OK to have several timeouts running, it's generally advisable not to have too many.

One thing we did was to have a single timeout/interval that, when fired, runs a set of functions which can be added or removed at any time.

/// Somewhere
var runnableFunctions = [];
var runningIntervalID = window.setInterval(function() {
    runnableFunctions.forEach(function(func) {
        if (typeof func === 'function') {
            func.call(null);
        }
    });
}, 1);


/// Elsewhere
$(domElem).on(event, function() {
    runnableFunctions.push(function() {
        // Do something on each interval tick.
        // Splice this function out of the array if it only needs to run once.
    });
});

This is just a rough example, but you get the idea: you can push functions into an array and have them all run from a single timeout/interval, rather than setting up many separate timeouts/intervals and then having to remember to stop each of them later.
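
As a rough usage sketch, a one-off job could remove itself from the array after it runs (reusing runnableFunctions from the example above):

// A one-shot task that removes itself from the registry after running once.
function oneShot() {
    // ... do the work once ...
    var i = runnableFunctions.indexOf(oneShot);
    if (i !== -1) {
        runnableFunctions.splice(i, 1);
    }
}
runnableFunctions.push(oneShot);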

JeffBaumgardt
  • Furthermore, to make it more elaborate, change the array from an array of functions to an array of objects that contain said function. That makes it easier to target one for removal later, or to check whether one is already registered (see the sketch after these comments). – JeffBaumgardt Jan 21 '16 at 19:00
  • @Amit before I joined my current company they had over 100 registered timers running to do various things, some blocking, some not. Once the functions started stacking up like that, we lost fidelity in the timers themselves: an interval set to 1000 ms started to fire ~10-20% late. – JeffBaumgardt Jan 21 '16 at 20:09
  • While not the best reference material I could think of, look [here](http://stackoverflow.com/a/13616530/3887516). At the very least this shows the problem you're describing is well known and solvable. Whether any specific browser implements a solid architecture to mitigate this issue is hard to tell - but I would suggest thoroughly testing any such assumptions before jumping to conclusions. – Amit Jan 21 '16 at 21:14
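
A rough sketch of the object-based registry suggested in the first comment above, with made-up names (registerTask, unregisterTask) just for illustration:

// Hypothetical object-based registry: entries carry a name so they can be
// looked up, skipped if already registered, and removed later.
var runnableTasks = [];

function registerTask(name, fn) {
    var alreadyRegistered = runnableTasks.some(function(task) {
        return task.name === name;
    });
    if (!alreadyRegistered) {
        runnableTasks.push({ name: name, fn: fn });
    }
}

function unregisterTask(name) {
    runnableTasks = runnableTasks.filter(function(task) {
        return task.name !== name;
    });
}

window.setInterval(function() {
    runnableTasks.forEach(function(task) {
        task.fn();
    });
}, 1);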