
I have a page with chained drop-downs. Choosing an option in the first select populates the second, and choosing an option in the second select returns a table of matching results, which I insert by setting the innerHTML property of an empty div on the page.
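For context, the wiring looks roughly like this (a simplified sketch; the endpoint and parameter names here are placeholders, not my actual code):

<script type="text/javascript">
// Sketch of the chained drop-downs; endpoint and parameter names are placeholders.
jQuery("#first-select").change(function() {
  jQuery.get("secondSelectOptions.php",
    { FIRST_VALUE: jQuery(this).val() },
    function(options) {
      // Replace the second select's options with the returned markup.
      jQuery("#second-select").html(options);
    },
    "html");
});

jQuery("#second-select").change(onFinalSelection); // shown in the EDIT below
</script>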

The problem is, once I've made my selections and a considerable amount of data is brought onto the page, all subsequent JavaScript on the page runs exceptionally slowly. It seems as if the data I pulled back via AJAX to populate the div is still hogging a lot of memory. I tried setting the object containing the AJAX results to null after assigning innerHTML, but with no luck.

Firefox, Safari, Chrome and Opera all show no performance degradation when I use JavaScript to insert a lot of data into the DOM, but in IE it is very apparent. To confirm that it's a JavaScript/DOM issue rather than a plain old IE issue, I created a version of the page that returns all the results on the initial load rather than via AJAX/JavaScript, and found that IE had no performance problems.

FYI, I'm using jQuery's jQuery.get method to execute the AJAX call.

EDIT: This is what I'm doing:

<script type="text/javascript">
function onFinalSelection() {
  var searchParameter = jQuery("#second-select").val();
  jQuery.get("pageReturningAjax.php",
    { SEARCH_PARAMETER: searchParameter },
    function(data) {
      // Note: the property is case-sensitive; it must be innerHTML.
      jQuery("#result-div").get(0).innerHTML = data;
      //jQuery("#result-div").html(data); //Tried this, same problem
      data = null;
    },
    "html");
}
</script>

I want to note that this only becomes an issue when the returned data is quite large. The slowdown is directly related to size: I see moderate slowdown for medium-sized results and major slowdown only when several hundred or more records are returned.

aw crud

3 Answers


You can force garbage collection in IE by using the CollectGarbage function, e.g.

if (typeof(CollectGarbage) == "function")
    CollectGarbage();

The JScript garbage collector is described in detail in this blog entry: http://blogs.msdn.com/ericlippert/archive/2003/09/17/53038.aspx

As the blog says, the GC is not predictable, so delete data or data = null will not reclaim the memory immediately, but the GC will eventually reclaim it.
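Applied to the callback from the question, it would look something like this (a sketch; CollectGarbage is IE-only and non-standard, hence the feature test):

jQuery.get("pageReturningAjax.php",
    { SEARCH_PARAMETER: searchParameter },
    function(data) {
        jQuery("#result-div").html(data);
        data = null; // drop the reference so the GC can reclaim the string...

        // ...then nudge IE's JScript collector (a no-op in other browsers).
        if (typeof(CollectGarbage) == "function")
            CollectGarbage();
    },
    "html");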


But I doubt that your performance penalty is really caused by memory usage; I think it is a problem with DOM rendering.
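One rough way to tell the two apart, inside the success callback (a sketch; old IE versions lack console.time, so plain Date timestamps are used):

// Inside the AJAX success callback:
var start = new Date().getTime();
jQuery("#result-div").html(data);
alert("insert took " + (new Date().getTime() - start) + " ms");

// Later, after the page has settled, time an unrelated piece of script:
start = new Date().getTime();
for (var i = 0; i < 100000; i++) { /* busy work */ }
alert("later script took " + (new Date().getTime() - start) + " ms");

If the insertion itself is slow but later scripts run at normal speed, the DOM/rendering is the bottleneck; if everything afterwards stays slow, memory pressure is the better suspect.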

ignaZ
  • do you know if there is any similar method to CollectGarbage that works in webkit browsers? – xus Apr 19 '12 at 11:40
  • do you know if this needs to be added to each javascript file in my application? – Ayusman Apr 30 '12 at 20:12
  • I'm not completely sure if this holds true with JS, but in other languages, when Garbage Collection is forced it will push all surviving objects into the next generation - which can be incredibly detrimental if you end up pushing lots of transient objects into later generations. – xelco52 Aug 23 '12 at 18:07
  • In IE11 the memory leak is a known issue. We will have to keep dealing with it, since it seems it will never be fixed. Modern browsers are fine. – Coty Embry Oct 02 '20 at 13:55

Use

$("#result-div").html(data);

html() utilizes jQuery's empty method, which works very hard to prevent memory leaks.
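If plain html() is not enough, you could also empty the container explicitly before inserting, so jQuery's cleanup runs before the large string is parsed (a sketch, not a guaranteed fix; html() already does this cleanup internally):

function(data) {
    var resultDiv = jQuery("#result-div");
    resultDiv.empty();    // runs jQuery's cleanup on the old nodes
    resultDiv.html(data); // then inserts the new markup
    data = null;
}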

Have you tried:

delete data;

I'm thinking there are other performance issues in your code causing the sluggishness. Does your returned data use PNGs with alpha transparency? I've seen that kill IE6 (when the alpha filter is applied) and slow down IE7 considerably.

David Murdoch
  • No luck with using the html function or delete. I'm just returning a table of text (many rows, but just text) – aw crud May 03 '10 at 19:07
  • Are you binding events to the rows, cells, text inside the cells? How much data are you inserting? – David Murdoch May 03 '10 at 19:09
  • There are upwards of 2000 rows, for a total of ~250kb. I'm only binding a single event on the DIV that contains the entire resultset. – aw crud May 03 '10 at 19:19
  • so, you really need 2000 rows imported at the same time? Can you import them in chunks/pages? (see the sketch after these comments) – David Murdoch May 03 '10 at 19:42
  • The query is quite complicated and takes several seconds to run whether I return 10 rows or 1000 rows. Paging is not really feasible as a result. The nature of the data is that a user will want to scroll through it quickly to locate notable entries, so even if saving the results in memory and doing paging this way worked, it would break up the flow. – aw crud May 04 '10 at 13:44
  • If you don't append the 2000 rows and just keep it cached as a Javascript object does the slowdown still occur? – David Murdoch May 04 '10 at 13:58
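A client-side chunked insert along those lines might look like this (a sketch; it assumes the AJAX response can be split into one HTML string per row, and the usage selector is hypothetical):

<script type="text/javascript">
// Fetch once (the query is expensive), then append the rows in batches
// so the browser gets breathing room between chunks.
function insertInChunks(rowStrings, containerSelector, chunkSize) {
    var i = 0;
    function appendNext() {
        jQuery(containerSelector).append(rowStrings.slice(i, i + chunkSize).join(""));
        i += chunkSize;
        if (i < rowStrings.length)
            setTimeout(appendNext, 0); // yield to the renderer between chunks
    }
    appendNext();
}

// Hypothetical usage, after splitting the response on row boundaries:
// insertInChunks(allRowStrings, "#result-div table tbody", 100);
</script>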

In case somebody is interested in browsers other than IE:

To force garbage collection in Gecko (note that this only works in privileged chrome code, such as extensions, not in ordinary web pages):

window.QueryInterface(Components.interfaces.nsIInterfaceRequestor)
  .getInterface(Components.interfaces.nsIDOMWindowUtils)
  .garbageCollect();


Max