
I have JavaScript which performs a whole lot of calculations as well as reading/writing values from/to the DOM. The page is huge so this often ends up locking the browser for up to a minute (sometimes longer with IE) with 100% CPU usage.

Are there any resources on optimising JavaScript to prevent this from happening (all I can find is how to turn off Firefox's long running script warning)?

William Hurst

7 Answers


If you can turn your calculation algorithm into something which can be called iteratively, you can release control back to the browser at frequent intervals by using setTimeout with a short timeout value.

For example, something like this...

function doCalculation()
{
   //do your thing for a short time

   //figure out how complete you are
   var percent_complete=....

   return percent_complete;
}

function pump()
{
   var percent_complete=doCalculation();

   //maybe update a progress meter here!

   //carry on pumping?
   if (percent_complete<100)
   {
      setTimeout(pump, 50);
   }
}

//start the calculation
pump();
Paul Dixon
    Do you know if this would still work if you use `setTimeout(pump, 0)`? Or would this possibly keep pre-empting the browser code that responds to mouse input, updates the progress meter or other DOM elements? – Andy Feb 25 '15 at 18:18
  • @Andy Yes `setTimeout` with 0 will help too. See some of the answers to [this question](https://stackoverflow.com/questions/779379/why-is-settimeoutfn-0-sometimes-useful). – ShreevatsaR Mar 27 '17 at 18:34

Use timeouts.

By putting the content of your loop(s) into separate functions and calling them from setTimeout() with a timeout of 50 or so, the JavaScript will yield control of the thread and come back some time later, allowing the UI to get a look-in.
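
For example, a rough sketch of processing a large array in slices; collectWorkItems, processItem and CHUNK_SIZE are placeholders for your own data, per-item work and slice size:

var items = collectWorkItems(); //placeholder: your big array of work
var index = 0;
var CHUNK_SIZE = 100;           //tune so each slice finishes quickly

function processChunk()
{
   var end = Math.min(index + CHUNK_SIZE, items.length);
   for (; index < end; index++)
   {
      processItem(items[index]); //placeholder for your per-item work
   }

   if (index < items.length)
   {
      //yield to the browser, then carry on with the next slice
      setTimeout(processChunk, 50);
   }
}

processChunk();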

There's a good walkthrough here.

Phil H

I had blogged about in-browser performance some time ago, but let me summarize the ones related to the DOM for you here.

  • Update the DOM as infrequently as possible. Make your changes to in-memory DOM objects and append them only once to the DOM (there's a sketch after this list).
  • Use innerHTML. It's faster than DOM methods in most browsers.
  • Use event delegation instead of regular event handling.
  • Know which calls are expensive, and avoid them. For example, in jQuery, $("div.className") will be more expensive than $("#someId").
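
As a rough illustration of the first three points (the data array, the 'list' container id and the click handling below are made up for the example):

//build the markup in memory and touch the DOM only once
var rows = [];
for (var i = 0; i < data.length; i++)
{
   rows.push('<li>' + data[i].name + '</li>'); //assumes the values are already escaped
}
var list = document.getElementById('list');
list.innerHTML = rows.join('');

//event delegation: one handler on the container instead of one per <li>
list.onclick = function(e)
{
   e = e || window.event;
   var target = e.target || e.srcElement;
   if (target.nodeName === 'LI')
   {
      //handle the click for this item
   }
};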

Then there are some related to JavaScript itself:

  • Loop as little as possible. If you have one function that collects DOM nodes, and another that processes them, you are looping twice. Instead, pass an anonymous function to the function that collects the nodes, and process the nodes as you are collecting them.
  • Use native functionality when possible. For example, forEach iterators.
  • Use setTimeout to let the browser breathe once in a while.
  • For expensive functions that have idempotent outputs, cache the results so that you don't have to recompute them (a minimal sketch follows this list).
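
For the caching point, a minimal memoization sketch; expensiveCalculation stands in for your own function, and the inputs are assumed to be usable as object keys:

var cache = {};

function cachedCalculation(input)
{
   if (!(input in cache))
   {
      cache[input] = expensiveCalculation(input); //only ever computed once per input
   }
   return cache[input];
}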

There's some more on my blog (link above).

Rakesh Pai

This is still a little bit bleeding edge, but Firefox 3.5 has these things called Web Workers; I'm not sure about their support in other browsers, though.

Mr. Resig has an article on them here: http://ejohn.org/blog/web-workers/

The simulated annealing demo is probably the simplest example of it: notice that the spinning Firefox logo does not freeze up while the worker threads are doing their requests (thus not freezing the browser).
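
A minimal sketch of the idea (the worker file name, the message shape and the 'result' element are made up for the example; browsers without Worker support need a fallback):

//main page: hand the heavy calculation to a background thread
if (window.Worker)
{
   var worker = new Worker('calculation-worker.js'); //hypothetical file name
   worker.onmessage = function(e)
   {
      //the UI stayed responsive while this was being computed
      document.getElementById('result').innerHTML = e.data;
   };
   worker.postMessage(1000000); //e.g. how many iterations to run
}

//calculation-worker.js: no DOM access here, just number crunching
onmessage = function(e)
{
   var total = 0;
   for (var i = 0; i < e.data; i++)
   {
      total += Math.sqrt(i);
   }
   postMessage(total);
};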

leeand00
  • Good point, it certainly should be up top now. Using setTimeout sounds hacky to me no matter what you're doing, unless you really need a timeout for something. If you go to codepen.io on Chrome for Windows right now and run an algorithm with, say, complexity O(N!), such as finding all permutations of the string "ABCDEFGHIJKLMNOP", your browser will lock up and become unresponsive. In a worker thread the UI keeps running on its own. This is clearly the correct answer. – AlphaG33k Jul 15 '15 at 02:11
  • I flagged this for moderation...we'll see what happens. – leeand00 Jul 17 '15 at 14:46

My experience is that DOM manipulation, especially in IE, is much more of an issue for performance than "core" JavaScript (looping, etc.).

If you are building nodes, it is much faster in IE to do so by building an HTML string and then setting innerHTML on a container than by using DOM methods like createElement/appendChild.
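
For example, a sketch of the difference (the items array and the container element are placeholders):

//slow in old IE: one DOM operation per node
var container = document.getElementById('container');
for (var i = 0; i < items.length; i++)
{
   var div = document.createElement('div');
   div.appendChild(document.createTextNode(items[i]));
   container.appendChild(div);
}

//usually much faster there: build one string, assign innerHTML once
var html = [];
for (var j = 0; j < items.length; j++)
{
   html.push('<div>' + items[j] + '</div>'); //assumes the values are already escaped
}
container.innerHTML = html.join('');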

jhurshman

You can try performing long-running calculations in threads (see JavaScript and Threads), although they aren't very portable.

You may also try using a JavaScript profiler to find performance bottlenecks. Firebug supports profiling JavaScript.
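
If nothing else, you can bracket suspect sections with console.time/console.timeEnd (supported by Firebug and most browser consoles) to see where the time goes; runExpensiveCalculation is a placeholder for your own code:

console.time('calculation');    //start a named timer
runExpensiveCalculation();      //placeholder for the code you suspect
console.timeEnd('calculation'); //logs the elapsed time for that label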

Eugene Morozov

You could try breaking the work up by deferring each iteration with setTimeout:

 $(xmlDoc).find("Object").each(function(arg1) {
    (function(arg1_received) {
                setTimeout(function(arg1_received_reached) {

                    //your stuff with the arg1_received_reached goes here 

                }(arg1_received), 0)
            })(arg1)
}(this));

or for "for" loops try

for (var i = 0; i < 10000; i = i + 1) {
    //pass the loop variable into a closure so each deferred callback
    //sees the right value, then yield with a zero-delay timeout
    (function(arg1_received) {
        setTimeout(function() {

            //your stuff with arg1_received goes here

        }, 0);
    })(i);
}

I had the same problem and my customers were reporting it as a "Kill page" error. This approach turned out to be the best solution I found for it. :)

LINTUism