4

I am developing a small web utility that displays data from a few database tables.

I have the utility running fine on FF, Safari, Chrome..., but the memory management on IE8 is horrendous. The largest JSON request will return information to create around 5,000 or so rows in a table within the browser (3 columns in the table).

I'm using jQuery to get the data (via getJSON). To remove the old/existing table, I just call $('#my_table_tbody').empty(). To add the new info to the table, within the getJSON callback I append each table row I create to a variable, and once I have them all I use $('#my_table_tbody').append(myVar) to add them to the existing tbody. I don't add the table rows as they are created because that seems to be much slower than adding them all at once.

Does anyone have any recommendation on what someone should do who is trying to add thousands of rows of data to the DOM? I would like to stay away from pagination, but I'm wondering if I don't have a choice.

Update 1: Here is the code I was trying after the innerHTML suggestion:

/* Assuming a div called 'main_area' holds the table */
document.getElementById('main_area').innerHTML = '';

$.getJSON("my_server", {my: JSON, args: are, in: here}, function(j) {
   var mylength = j.length;
   var k = 0;
   var tmpText = '';
   tmpText += ''; /* Add the table, thead stuff, and tbody opening tags here */
   for (k = mylength - 1; k >= 0; k--)
   {
      tmpText += '<tr class="' + j[k].row_class + '"><td class="col1_class">' + j[k].col1 + '</td><td class="col2_class">' + j[k].col2 + '</td><td class="col3_class">' + j[k].col3 + '</td></tr>';
   }

   document.getElementById('main_area').innerHTML = tmpText;
});

That is the gist of it. I've also tried using just a $.get request, and having the server send the formatted HTML, and just setting that in the innerHTML (i.e. document.getElementById('main_area').innerHTML = j;).

Thanks for all of the replies. I'm floored with the fact that you all are willing to help.

John Hartsock
okie.floyd
  • Need to see the code you're running to optimize anything... Document fragments and caching via jQuery are probably what you want here. – Nick Craver Mar 31 '10 at 17:28
  • Document fragments via jQuery are not necessary anymore since 1.4 ;) However, you can try to split the JSON into 10-20 pieces and use a small delay (5-10ms) between inserts. – Ionuț Staicu Mar 31 '10 at 17:59
  • It is not the bulk of JSON that is causing the problems; it is adding the formatted HTML into IE. ~1 megabyte of JSON data should not result in a 100MB memory footprint. – okie.floyd Mar 31 '10 at 19:05
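The split-and-delay suggestion from the comments above can be sketched as follows. This is a rough illustration, not tested against IE8; buildRowHtml, CHUNK_SIZE, and the 10ms delay are assumptions, not from the post:

```javascript
// Split the row data into slices and append each slice on a short timeout,
// so IE's UI thread (and garbage collector) gets a chance to breathe
// between inserts.

var CHUNK_SIZE = 250;   // rows per insert; tune for the target browser

// Break an array of rows into arrays of at most `size` rows each.
function chunk(rows, size) {
  var chunks = [];
  for (var i = 0; i < rows.length; i += size) {
    chunks.push(rows.slice(i, i + size));
  }
  return chunks;
}

// Build one <tr> as a string, matching the column classes from the question.
function buildRowHtml(row) {
  return '<tr class="' + row.row_class + '">' +
         '<td class="col1_class">' + row.col1 + '</td>' +
         '<td class="col2_class">' + row.col2 + '</td>' +
         '<td class="col3_class">' + row.col3 + '</td></tr>';
}

// Insert the chunks one at a time, yielding between each insert.
function insertChunked(rows) {
  var chunks = chunk(rows, CHUNK_SIZE);
  var i = 0;
  (function insertNext() {
    if (i >= chunks.length) return;
    var parts = [];
    var current = chunks[i++];
    for (var k = 0; k < current.length; k++) {
      parts.push(buildRowHtml(current[k]));
    }
    $('#my_table_tbody').append(parts.join(''));
    setTimeout(insertNext, 10);   // the 5-10ms delay suggested above
  })();
}
```

Chunking trades total insert time for responsiveness: the table fills in progressively instead of freezing the browser for one long insert.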

5 Answers

2
    var tmpText = [];
    for (k = mylength - 1; k >= 0; k--)
    {
       /* push anything you want, one piece of markup at a time */
       tmpText.push('<tr class="' + j[k].row_class + '"><td class="col1_class">' + j[k].col1 + '</td><td class="col2_class">' + j[k].col2 + '</td><td class="col3_class">' + j[k].col3 + '</td></tr>');
    }
    $('#main_area').html(tmpText.join(''));

You don't need document.getElementById('main_area').innerHTML = '';

The method is to push each row string into an array, then join it and use jQuery's html() function to update. This is the fastest method I know. Sorry for the formatting here; it's my first post, and I thought I'd give something back to Stack Overflow.

Curtis
  • 101,612
  • 66
  • 270
  • 352
Richard
0

To get IE to respond quickly, you should create your table rows as string representations of HTML, append them to a string variable, and then add the result to your table's tbody like this:

myTable.myTbody.innerHTML = allThoseRowsAsAString;

It's not a memory issue: 5,000 rows should be trivial. That's got to be far less than one megabyte.

Robusto
  • Actually repeatedly appending to a string is not really a good idea in IE. (IE8 may be better than IE6 and 7 were.) – Pointy Mar 31 '10 at 17:43
  • If you want to avoid memory leaks, concatenating large strings is about the worst way to go about it. – Nick Craver Mar 31 '10 at 17:50
  • I'm just saying string concatenation is the fastest way to add lots of markup to a page, especially in IE. Read the http://stackoverflow.com/questions/112158/javascript-string-concatenation thread, where the idea of string-concatenation-as-bottleneck is dismissed. – Robusto Mar 31 '10 at 17:57
  • IE8 has some bugs with innerHTML. When I try to do this, it gives me an error. – okie.floyd Mar 31 '10 at 18:01
  • I got the innerHTML to work, but it did not solve any of my memory issues. Uncompressed, we are talking about roughly 1.5MB of JSON data (Apache gets it down to ~85k in the request). 1.5MB of JSON is still causing a 100MB spike in memory. – okie.floyd Mar 31 '10 at 18:13
  • Have you installed dynaTrace and done some investigation into where the time is going? – Pointy Mar 31 '10 at 18:28
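For what it's worth, the two approaches debated in these comments produce identical markup; the disagreement is only about how intermediate strings are allocated, which mattered a lot in IE6/7 and much less in IE8+. A minimal comparison (buildRow is an illustrative helper, not from the answer):

```javascript
// A simplified two-column row builder, just for the comparison.
function buildRow(row) {
  return '<tr><td>' + row.a + '</td><td>' + row.b + '</td></tr>';
}

// Repeated += : each pass allocates a new, longer string.
function concatWay(rows) {
  var html = '';
  for (var i = 0; i < rows.length; i++) {
    html += buildRow(rows[i]);
  }
  return html;
}

// Array push + join: collect the pieces, do one join at the end.
function joinWay(rows) {
  var parts = [];
  for (var i = 0; i < rows.length; i++) {
    parts.push(buildRow(rows[i]));
  }
  return parts.join('');
}
```

Either way, the result is handed to the browser in a single innerHTML (or .html()) assignment, which is the part that actually matters for render speed.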
0

Robusto is right about innerHTML assignment being a better way to go. IE sucks at dynamic DOM creation.

Why not form your innerHTML on the server using a JSP and stream it back via Ajax in one shot? It will definitely speed things up, remove complexity from your JavaScript, and delegate markup creation to its proper place.
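A sketch of that idea, under the assumption that the server endpoint returns ready-made row markup (the helper names refreshFromServer and applyServerHtml are illustrative, not from the answer):

```javascript
// Apply server-rendered markup with a single assignment: one parse and
// one reflow, no per-row DOM work on the client.
function applyServerHtml(container, html) {
  container.innerHTML = html;
}

// The server (JSP, PHP, whatever) emits the <tr>...</tr> rows directly.
function refreshFromServer() {
  $.get('my_server', function (html) {
    applyServerHtml(document.getElementById('main_area'), html);
  });
}
```

This moves the formatting cost to the server, where building a few thousand rows of text is cheap, and leaves the browser with nothing but one string assignment.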

plodder
  • I have tried forming the HTML on the server and sending it for insertion in one chunk, but granted, I did not try inserting it via innerHTML. I will try it and let you know how it works (I am using PHP on the server side, btw). – okie.floyd Mar 31 '10 at 18:39
  • Whoever gave the negative vote, would you please explain why? – plodder Mar 31 '10 at 18:57
  • 'Twasn't me. I need all of the help I can get, and I certainly wouldn't ding someone for trying. – okie.floyd Mar 31 '10 at 19:05
0

As plodder said, IE has big problems when working with the DOM. jQuery best practices recommend building your markup as a single string and appending it just once to the container.

Besides this, I recently had a similar problem with hierarchical data, around 5,000 records in all. I asked myself: does the user really need all that information available at a given moment? Then I realized the best I could do was present a first chunk of data and then load more on user demand.

Finally, just one good tool: dynaTrace Ajax (it helps a lot in finding the JavaScript functions that take the most time to run).

Esteve Camps
0

Since you are dealing with thousands of data rows, I wouldn't call $('#my_table_tbody').empty() and add the new data with new DOM elements. Instead, I'd follow the Object Pool pattern: rather than dropping all the tr's, you can reuse the existing ones and just populate them with the new data.

If your new data set has fewer rows than the previous one, remove the extra rows from the DOM but keep references to them in a pool so that the garbage collector won't destroy them. If your new data set is bigger, just create new tr's on demand.
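The pooling idea above can be sketched like this (RowPool and the factory callback are illustrative names, not from the YUI source):

```javascript
// A pool of detached rows: prefer recycling an existing row over creating
// a new one, and keep released rows referenced so GC leaves them alone.
function RowPool(createRow) {
  this.createRow = createRow;   // factory, e.g. function () { return document.createElement('tr'); }
  this.pool = [];
}

// Hand back a pooled row if one is available, otherwise make a new one.
RowPool.prototype.acquire = function () {
  return this.pool.length ? this.pool.pop() : this.createRow();
};

// Take a row out of the table but keep the reference for later reuse.
RowPool.prototype.release = function (row) {
  this.pool.push(row);
};
```

On each data refresh you would acquire as many rows as the new data set needs, overwrite their cell contents, and release any surplus rows instead of destroying them.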

You can look at the implementation of YUI DataTable, here's the source. IIRC they use this approach to speed up the render time.

Ihor Kaharlichenko