
On a web page I have a fairly large list of items (product cards, each containing an image and text), about 1000 of them. I want to filter this list on the client (only items that pass the filter should be shown), but there is a rendering performance problem. When I apply a very narrow filter so that only 10-20 items remain, then cancel it (so all items have to be shown again), the browser (Chrome on a very nice machine) hangs for a second or two.

I re-render the list using the following routine:

// Show entries whose id is in dict; hide the rest
for (var i = 0, l = this.entries.length; i < l; i++) {
    $(this.cls_prefix + this.entries[i].id).css("display", this.entries[i].id in dict ? "block" : "none");
}

dict is a hash whose keys are the ids of the allowed items.

The function itself runs instantly; it's the rendering that hangs. Is there a more efficient way to re-render than changing the "display" property of DOM elements?

Thanks in advance for your answers.

Alex Zaretsky
  • You're surprised that re-rendering 1000 elements takes 1-2 seconds? Since I doubt 1000 elements are all visible at any moment, perhaps you should handle the visible items first, and then work in the background to make the rest available (doing 50 per pass, with setTimeout() between each batch, to keep the browser alive). Or perhaps you should only re-render items when they would actually become visible due to scrolling. It also isn't helping you to run 1000 separate selector operations, each of which has to search the entire DOM. – jfriend00 Apr 14 '12 at 05:49
  • Give us a jsFiddle to work on and I'm sure we could improve the switchover performance by 10x. There's a lot of juicy fat in that code. – jfriend00 Apr 14 '12 at 05:52

3 Answers


Why render 1000 items at once? First, you should consider something like pagination, showing around 30 items per page. That way you are not rendering that much at a time.
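As a rough sketch of client-side paging, reusing the entries array from the question (the #list container, PAGE_SIZE value, and cardHtml helper are illustrative names, not from the original post):

var PAGE_SIZE = 30; // illustrative page size

// Hypothetical markup for one card; the real cards hold an image and text.
function cardHtml(entry) {
  return '<div class="card" id="card-' + entry.id + '">' + entry.id + '</div>';
}

// Render only the current page's slice of entries into a #list container.
function renderPage(entries, page) {
  var slice = entries.slice(page * PAGE_SIZE, (page + 1) * PAGE_SIZE);
  $('#list').html(slice.map(cardHtml).join('')); // one DOM write per page
}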

Then, if you really need to loop over a lot of items, consider using timeouts. Here's a demo I had once that illustrates the consequences of looping: a long synchronous loop blocks the UI and makes the browser lag. With timers you delay each batch of iterations, allowing the browser to breathe once in a while and do something else before the next batch starts.
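For example, the question's loop could be split into chunks like this (the chunk size of 50 is an arbitrary assumption, as are the parameter names):

// Process the entries in chunks of 50, yielding to the browser between
// chunks so the UI can repaint and handle events.
function filterInBatches(entries, clsPrefix, dict) {
  var i = 0;
  function step() {
    for (var end = Math.min(i + 50, entries.length); i < end; i++) {
      $(clsPrefix + entries[i].id).css('display', entries[i].id in dict ? 'block' : 'none');
    }
    if (i < entries.length) setTimeout(step, 0);
  }
  step();
}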

Another thing to note is that you should avoid repaints and reflows, which means avoiding moving elements around and changing styles more often than necessary. Also, remove from the DOM the nodes that are not actually visible. If you don't need to display something, remove it; why waste memory on something that isn't actually seen?
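One way to limit reflows in the asker's case is to detach the list's container, toggle all 1000 styles off-document, then re-attach it so layout is recalculated once. A sketch, assuming the cards live in a hypothetical #list container and reusing the question's variables as locals:

// Detach the container so the style changes happen off-document,
// then re-attach; the browser then performs a single reflow.
var $list = $('#list');
var $placeholder = $('<div/>').insertBefore($list);
$list.detach();
for (var i = 0, l = entries.length; i < l; i++) {
  $list.find(clsPrefix + entries[i].id).css('display', entries[i].id in dict ? 'block' : 'none');
}
$placeholder.replaceWith($list);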

Joseph

You can use the setTimeout trick, which defers each update so the browser can process events in between and the client doesn't freeze. I suspect the total processing, from start to finish, takes about the same amount of time, but at least this way the interface stays usable, and the result is a better user experience:

var self = this; // setTimeout callbacks lose the outer "this"
for (var i = 0, l = self.entries.length; i < l; i++) {
  (function (entry) { // capture the current entry; a plain i would be stale
    setTimeout(function () {
      $(self.cls_prefix + entry.id).css("display", entry.id in dict ? "block" : "none");
    }, 0);
  })(self.entries[i]);
}
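Note that this schedules one timer per entry, and with 1000 entries the timer overhead itself adds up; batching several updates per timeout, as in the previous answer, is usually cheaper.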
gyo

Dude, the best way to handle "large amounts of DOM elements" is NOT to do it on the client, and/or NOT to use JavaScript if you can avoid it.

If there's no better solution (and I'm sure there probably is!), then at LEAST partition your working set down to what you actually need to display at that moment (instead of the whole, big, honkin' enchilada!)

paulsm4
  • I downvoted your answer, because there's nothing necessarily wrong with handling large amounts of DOM elements on the client or using JavaScript. Average computers can handle DOM updates involving tens of thousands (if not more) of elements within a second, provided you minimize algorithmic complexity, reflows, etc. In practice, this can often be much more responsive than a round trip to a server, especially because the client will still end up updating the DOM with whatever HTML the server provides. The problem is not JavaScript, but how it's used. – Hans Roerdinkholder Oct 30 '14 at 10:20
  • Saying "Don't use JavaScript" for client-side rendering in the year 2014 is very poor advice, especially when Google, Mozilla, Microsoft, etc. are investing millions of dollars every year making the web faster, better, and free for the end user. Maybe that's why for the latest trading apps, cloud solutions, music players, CAD and graphical applications, massive shops (you name it), all you need is a browser that you can download for free. – Martin Oct 16 '15 at 10:03