
I have this code:

$.getJSON( "https://domain.ltd/parse_data.php", function( data_recieved ) {
  if (data_recieved.length) {
    $.each(data_recieved, function(index, element) {
      $( ".items" ).append( '<span>' + element.name + ' = ' + element.amount + '</span><br />' );
    });
  }
});

As you can see, it fetches JSON and displays the results with `append()`. However, if there are 500 rows of data in the response, it can take up to 30 seconds to append all 500 lines, and while that's happening the website is unresponsive.

Not only that, my CPU usage climbs to 50%.

Am I doing it wrong? Maybe there is a more efficient way to parse through this much data and display it dynamically with jQuery?

Steven
    [This should be instant.](https://jsfiddle.net/oLxjup9q/) – Jeto Jul 24 '20 at 08:41
  • @Jeto I'm not sure if I'm getting it right. How would I know the number of lines in advance? Maybe you could elaborate a little bit in an answer? – Steven Jul 24 '20 at 08:49
  • That fiddle was just to showcase that appending 500 lines using jQuery should not be any problem. So whatever's taking so much time/processing power, it's probably not it. – Jeto Jul 24 '20 at 08:51
    Why not concatenate all the spans into one variable and then call `.append()` only once, outside the loop? That spares the DOM traversal inside the loop. – techie_28 Jul 24 '20 at 08:53
  • @Jeto Well, the JSON page opens in less than a second, and then jQuery takes up to 30 seconds to parse through it. There is far more HTML and there are more data points in my real code; maybe that's why it is not as quick as the simplified version. – Steven Jul 24 '20 at 08:59

2 Answers


I believe this to be a better solution:

$.getJSON( "https://domain.ltd/parse_data.php", function( data_recieved ) {
  if (data_recieved.length) {
    var spns = '';
    $.each(data_recieved, function(index, element) {
      spns += '<span>' + element.name + ' = ' + element.amount + '</span><br />';
    });
    $( ".items" ).append(spns); // or use .html()
  }
});

It seems like your DOM tree is deep, and running `$( ".items" )` plus `.append()` inside the loop is getting expensive.
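The same single-write idea can also be expressed without manual `+=` concatenation, using `map()`/`join()`. A minimal sketch (the `dataReceived` array here is sample data standing in for your JSON response, not part of your code):

```javascript
// Sample data shaped like the JSON response: name/amount pairs.
const dataReceived = [
  { name: 'alpha', amount: 3 },
  { name: 'beta', amount: 7 }
];

// Build the entire markup string in one pass...
const markup = dataReceived
  .map(el => '<span>' + el.name + ' = ' + el.amount + '</span><br />')
  .join('');

// ...then touch the DOM exactly once:
// $('.items').append(markup);
console.log(markup); // <span>alpha = 3</span><br /><span>beta = 7</span><br />
```

Either way, the key point is that the DOM is written once per response, not once per row.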

techie_28
  • Thank you, this did the trick. Now it works 10 times faster! – Steven Jul 24 '20 at 09:59
  • Correction: actually it is working hundreds of times faster. It takes 30–40 ms to go through 900 lines of data. Now the only thing that takes time is the JSON-generating script on the server side :) – Steven Jul 24 '20 at 11:48

You could improve your code so that it performs better. I have applied and described a few tips in the code below. You can see the elapsed time in the developer console of this page.

// just for simulating your JSON
var dataReceived = [];

for (var i = 0; i < 500; i++) {
  // Math.floor is the idiomatic way to truncate a number (parseInt expects a string)
  dataReceived.push({ name: 'Element ' + i, amount: Math.floor(Math.random() * i) });
}

// $.getJSON( "https://domain.ltd/parse_data.php", function( dataReceived ) {
// optimization starts here
console.time('test');

// use a simple for loop and save the length of the received data
var element, dataReceivedLength = dataReceived.length;

// create a variable and append all HTML to it
var dataMarkup = '';

// if you want to target only one element, use an id as the selector,
// and save the jQuery object to a variable
var $items = $('#first-collection');

// check that $items exists and there is data to render
if ($items.length && dataReceivedLength) {
  for (var i = 0; i < dataReceivedLength; i++) {
    element = dataReceived[i];
    dataMarkup += '<span>' + element.name + ' = ' + element.amount + '</span><br />';
  }

  // use html() instead of append() for performance reasons in this case
  $items.html(dataMarkup);
}

console.timeEnd('test');
// });
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>

<p id="first-collection" class="items"></p>
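For reference, a plain `for` loop and a per-element callback produce identical markup; the difference is only in how fast the string is assembled. A rough standalone sketch (plain JavaScript, no jQuery, so it runs anywhere; the variable names are mine):

```javascript
// Build 500 rows of sample data shaped like the JSON response.
const data = [];
for (let i = 0; i < 500; i++) {
  data.push({ name: 'Element ' + i, amount: Math.floor(Math.random() * (i + 1)) });
}

// Variant 1: for loop with the length cached once.
let viaFor = '';
for (let i = 0, len = data.length; i < len; i++) {
  viaFor += '<span>' + data[i].name + ' = ' + data[i].amount + '</span><br />';
}

// Variant 2: a callback per element, roughly what $.each does.
let viaEach = '';
data.forEach(function (el) {
  viaEach += '<span>' + el.name + ' = ' + el.amount + '</span><br />';
});

console.log(viaFor === viaEach); // true: same markup either way
```

Wrapping each variant in `console.time()`/`console.timeEnd()` lets you measure which one is faster in your own browser.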
UfguFugullu
  • Thank you, but I chose @techie_28's answer because it works without a `for` loop. Is there any benefit to adding a `for` loop here? – Steven Jul 24 '20 at 10:00
  • @Steven That's no problem. The `for` loop performs better, as you can see in this post: [$.each() vs for() loop - and performance](https://stackoverflow.com/questions/11887450/each-vs-for-loop-and-performance) – UfguFugullu Jul 24 '20 at 10:41
  • Oh, wow. I'll try your method and will report back my results. – Steven Jul 24 '20 at 11:01
  • I've decided not to use a `for` loop, because when I checked the current solution with the `timeLog` function, it showed 30–40 ms for 900 lines of data. Generating the JSON file is now the slowest part of my script (it takes 1200 ms). Thank you for such a beautifully written answer anyway! – Steven Jul 24 '20 at 11:15