61

I'm using JavaScript to parse an XML file with about 3,500 elements. I'm using a jQuery "each" function, but I could use any form of loop.
The problem is that the browser freezes for a few seconds while the loop executes. What's the best way to stop freezing the browser without slowing the code down too much?

$(xmlDoc).find("Object").each(function() {
    //Processing here
});
Chris B
  • Get a faster language! No, really: unless it's absolutely necessary, don't use JS for this - as you see, it's 1) single-threaded and 2) slow. – Piskvor left the building Apr 03 '09 at 20:34
  • This is a client-side function, and JS is necessary. – Chris B Apr 20 '09 at 20:04
  • @Triptych - And his options are? Certainly one would hope that much heavy lifting like this could be performed server side, but since we don't know his situation it's best to assume that he has good reason for doing it client-side, and when working client side in a web app, you only really have a choice between Javascript and, well... Javascript. – Toji Aug 27 '09 at 18:11
  • Best practices for heavy computation in JavaScript http://stackoverflow.com/q/13947592/821057 – Zaheer Ahmed Sep 24 '13 at 08:21

10 Answers

73

I would ditch the "each" function in favour of a for loop since it is faster. I would also add some waits using setTimeout, but only every so often and only if needed. You don't want to wait 5 ms for each record, because then processing 3,500 records would take approximately 17.5 seconds.

Below is an example using a for loop that processes 100 records (you can tweak that) at 5 ms intervals which gives a 175 ms overhead.

var xmlElements = $(xmlDoc).find('Object');
var length = xmlElements.length;
var index = 0;
var process = function() {
  for (; index < length; index++) {
    var toProcess = xmlElements[index];
    // Perform xml processing
    if (index + 1 < length && index % 100 == 0) {
        index++;                // move past the current element before yielding
        setTimeout(process, 5); // yield to the browser, then resume the loop here
        break;
    }
  }
};
process();

I would also benchmark the different parts of the XML processing to see if there is a bottleneck somewhere that may be fixed. You can benchmark in Firefox using Firebug's profiler, and by writing out to the console like this:

// start benchmark
var t = new Date();
// some xml processing
console.log("Time to process: " + (new Date() - t) + "ms");

Hope this helps.

Helgi
  • This was a great idea - use the setTimeout periodically. It works with a timeout of 0. – Chris B Apr 22 '09 at 20:38
  • I've done exactly this for several web apps that required massive data processing on the client end. Works like a charm, even if it does require a bit of restructuring. – Toji Aug 27 '09 at 18:12
  • Cool code. Maybe I'm missing something, but I had to add a `index++` and a `break` after the setTimeout() in order to get this to work. – D.Tate Apr 12 '13 at 22:53
  • I don't know If I am missing something, but my code when molded according to what you have provided here is going in infinite loop. – Rishi Nov 13 '15 at 09:16
  • better using console.time() and console.timeEnd() for benchmarking – SalmanShariati Feb 03 '16 at 20:49
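As the last comment suggests, console.time() and console.timeEnd() pair a named timer with its printout, which avoids the manual Date arithmetic; a minimal sketch (the label is arbitrary):

```javascript
console.time("xml-processing");    // start a timer under the label "xml-processing"
// ... the xml processing you want to measure ...
console.timeEnd("xml-processing"); // stop the timer and log the elapsed time for that label
```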
23

Set a timeout between processing to prevent the loop cycle from eating up all the browser resources. In total it would only take a few seconds to process and loop through everything - not unreasonable for 3,500 elements.

var xmlElements = $(xmlDoc).find('Object').toArray(); // plain array, so shift() is available

var processing = function() {
  var element = xmlElements.shift();

  //process element;

  if (xmlElements.length > 0) {
    setTimeout(processing, 5);
  }
};

processing();
TJ L
  • I decided on this method, except I only run the setTimeout every 50 elements. And yes, it works with a timeout of 0. – Chris B Apr 20 '09 at 20:05
  • @Christoph - you don't need to pass any timeout interval '0' since the default is `0`. – vsync Jun 24 '18 at 19:03
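Chris B's follow-up (yielding only every 50 elements, with a timeout of 0) can be sketched roughly as below; the stand-in array and the batch size are illustrative, and in the real code the jQuery collection would first be converted with `.toArray()`:

```javascript
// Stand-in for $(xmlDoc).find('Object').toArray() - a plain array of items
var items = [];
for (var n = 0; n < 3500; n++) { items.push(n); }

var processed = 0;
function processBatch() {
  // handle up to 50 items synchronously...
  for (var i = 0; i < 50 && items.length > 0; i++) {
    var element = items.shift();
    processed++; // replace with the real per-element work
  }
  // ...then yield so the browser can repaint and handle input
  if (items.length > 0) {
    setTimeout(processBatch, 0);
  }
}
processBatch();
```

Larger batches mean less setTimeout overhead but longer stretches where the browser is unresponsive, so the batch size is a tuning knob.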
6

I'd consider converting the 3,500 elements from XML to JSON server-side, or even better, uploading it to the server already converted, so that it's native to JS from the get-go.

This would minimize your load and probably make the file size smaller too.
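For illustration (the payload and field names here are made up): once the response arrives as JSON, JSON.parse yields native objects, and a plain loop over them avoids per-element XML traversal entirely:

```javascript
// Stand-in for a server response that was converted from XML to JSON
var json = '[{"id":1,"name":"a"},{"id":2,"name":"b"}]';

var objects = JSON.parse(json); // native array of plain objects
var names = [];
for (var i = 0; i < objects.length; i++) {
  names.push(objects[i].name); // no DOM/XML traversal per element
}
```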

Mikko Tapionlinna
3

JavaScript is single-threaded, so aside from setTimeout, there's not much you can do. If using Google Gears is an option for your site, it provides the ability to run JavaScript in a true background thread.

Gabe Moothart
3

You can call setTimeout() with a duration of zero and it will yield as desired.
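A tiny illustration of that yield; the `order` array is just a stand-in to show when the callback runs:

```javascript
var order = [];
order.push("before");
setTimeout(function () {
  order.push("deferred"); // runs only after the current code has finished and yielded
}, 0);
order.push("after");
// at this point "deferred" has not run yet: the callback waits its turn in the queue
```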

Scott Evernden
3

Long loops without freezing the browser are possible with the Turboid framework. With it, you can write code like:

loop(function(){  
        // Do something...  
}, number_of_iterations, number_of_milliseconds);

More details in this turboid.net article: Real loops in Javascript

Flimm
2

You could use the HTML5 Web Workers API, but that will only work in the Firefox 3.1 and Safari 4 betas at the moment.

olliej
1

I had the same problem, which was happening when the user refreshed the page successively. The reason was two nested for loops which ran more than 52,000 times. This problem was harsher in Firefox 24 than in Chrome 29, since Firefox would crash sooner (around 2,000 ms sooner than Chrome). What I did, and it worked, was to use "for" loops instead of each, and then refactor the code so that I divided the whole loop array into 4 separate calls and then merged the result into one. This solution has proven to work.

Something like this:

var entitiesToLoop = ["..."]; // Mainly a big array

var loopForSubset = function (startIndex, endIndex) {
    for (var i = startIndex; i < endIndex; i++) {
        //Do your stuff as usual here
    }
};

loopForSubset(0, firstInterval);
loopForSubset(firstInterval, secondInterval);
 ...

The other solution which also worked for me was the same idea implemented with the Workers API from HTML5. Use the same concept in workers: they keep the browser from freezing because they run in the background, off your main thread. If just applying this with the Workers API did not work, place each instance of loopForSubset in a different worker and merge the results inside the main caller of the Worker.

I mean this might not be perfect, but it has worked. I can help with more real code chunks if someone still thinks this might suit them.
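The split-and-merge idea can be sketched as below; the array contents, the doubling stand-in for per-entity work, and the fixed four-way split are all illustrative:

```javascript
// Stand-in for the big entity array
var entities = [];
for (var n = 0; n < 52000; n++) { entities.push(n); }

// process one subset of the range and return its partial result
var loopForSubset = function (startIndex, endIndex) {
  var partial = [];
  for (var i = startIndex; i < endIndex; i++) {
    partial.push(entities[i] * 2); // stand-in per-entity work
  }
  return partial;
};

// divide the whole range into 4 separate calls and merge the results into one
var results = [];
var quarter = entities.length / 4;
for (var q = 0; q < 4; q++) {
  results = results.concat(loopForSubset(q * quarter, (q + 1) * quarter));
}
```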

FidEliO
1

You could try shortening the code by deferring each element's processing:

$(xmlDoc).find("Object").each(function() {
    var element = this;
    setTimeout(function() {

        //your stuff with element goes here

    }, 0);
});

This won't harm you much ;)

LINTUism
0

As a modification of @tj111's answer, here is the full usable code:

    //add pop and shift functions to jQuery library. put in somewhere in your code.
    //pop function is now used here but you can use it in other parts of your code.
    (function( $ ) {
        $.fn.pop = function() {
            var top = this.get(-1);
            this.splice(this.length-1,1);
            return top;
        };

        $.fn.shift = function() {
            var bottom = this.get(0);
            this.splice(0,1);
            return bottom;
        };
    })( jQuery );


//the core of the code:
    var $divs = $('body').find('div');//.each();
    var s = $divs.length;
    var mIndex = 0;
    var process = function() {
        var $div = $divs.first(); // current element; $divs keeps the remaining collection
    //here your own code.

    //progress bar:
        mIndex++;
    // e.g.:    progressBar(mIndex/s*100.,$pb0);

    //start new iteration.
        $divs.shift();
        if ($divs.length > 0) {
            setTimeout(process, 5);
        } else {
    //when calculations are finished.
            console.log('finished');
        }
    };
    process();
Vyacheslav