Using element IDs is the fastest way for JavaScript to 'get' an element. Is there a rule of thumb or best-practices guideline for how many IDs can be used before browser performance starts to degrade?
-
Do you mean how many times can you call `document.getElementById` before stuff starts slowing down? – sdleihssirhc Apr 13 '11 at 23:24
-
Related: [how many div's can you have before the dom slows and becomes unstable?](http://stackoverflow.com/questions/2923524/how-many-divs-can-you-have-before-the-dom-slows-and-becomes-unstable) – harpo Apr 13 '11 at 23:36
-
I'll go with infinite. Try proving me wrong `:)` – Šime Vidas Apr 13 '11 at 23:41
-
I'm not worried about the number of calls; I don't see why that would make a difference. I'm concerned about the runtime of each individual call. Say you have 1000+ DOM elements with IDs attached to them: will each `getElementById` call suffer in performance? – Dave Lee Apr 14 '11 at 17:57
3 Answers
An ID, in and of itself, is just an attribute value. The only 'performance' cost is the extra bits and bytes the browser has to download. From a JavaScript point of view, the more elements in the DOM, the longer it can take to traverse it, but that's not directly related to the number of IDs you may be using.
EDIT:
To clarify: if your JS is this:
```js
document.getElementById("myID")
```
it doesn't matter if your HTML looks like this:
<div id="div1">
<div id="div2">
...
<div id="div999">
<div id="myDiv">
or this:
```html
<div>
<div>
...
<div>
<div id="myDiv">
```
The JS should run the same for both of those examples.
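To convince yourself, a rough benchmark along these lines can be pasted into a browser console. This is only a sketch: the element counts, the `div1`/`myDiv` names, the iteration count, and the use of `performance.now()` are all arbitrary illustrative choices, not anything from the answer above.

```js
// Sketch: build a page with many (or zero) id'd elements, then time
// repeated getElementById calls against a single target element.
function buildDivs(count, withIds) {
  const frag = document.createDocumentFragment();
  for (let i = 0; i < count; i++) {
    const div = document.createElement('div');
    if (withIds) div.id = 'div' + (i + 1);
    frag.appendChild(div);
  }
  const target = document.createElement('div');
  target.id = 'myDiv';
  frag.appendChild(target);
  document.body.appendChild(frag);
}

function timeLookups(iterations) {
  let el; // keep the result so the lookup isn't optimized away
  const start = performance.now();
  for (let i = 0; i < iterations; i++) {
    el = document.getElementById('myDiv');
  }
  return performance.now() - start;
}

buildDivs(1000, true); // flip to false for the id-less variant
console.log(timeLookups(100000) + ' ms');
```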

-
So the performance of the JavaScript calls is only affected by the number of DOM elements, and not by the number of elements that have IDs? Doesn't the browser create a hash or something for looking up DOM elements by ID? Will performance degrade as the size of this 'hash' increases or not? – Dave Lee Apr 14 '11 at 17:58
-
There's only one DOM. The DOM has nodes, each node being an HTML element. The more nodes, the longer it takes for JavaScript to crawl through it. An ID is just an attribute, so the number used doesn't have a direct correlation to JS performance. – DA. Apr 14 '11 at 21:06
A complex page means more bytes to download, and it also means slower DOM access in JavaScript. It makes a difference whether you loop through 500 or 5000 DOM elements on the page when you want to add an event handler, for example.
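For that event-handler case, rather than looping over thousands of elements to bind handlers, one handler can be delegated to a common ancestor. A minimal sketch, assuming a hypothetical `#list` container and `.item` class:

```js
// Sketch: one delegated handler on an ancestor replaces thousands of
// per-element handlers. '#list' and '.item' are made-up names.
document.querySelector('#list').addEventListener('click', function (event) {
  const item = event.target.closest('.item');
  if (item) {
    console.log('Clicked item:', item.textContent);
  }
});
```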
A high number of DOM elements can be a symptom that there's something that should be improved in the markup of the page, without necessarily removing content. Are you using nested tables for layout purposes? Are you throwing in more `<div>`s only to fix layout issues? Maybe there's a better and more semantically correct way to do your markup.
A great help with layouts are the YUI CSS utilities: grids.css can help you with the overall layout, while fonts.css and reset.css can help you strip away the browser's default formatting. This is a chance to start fresh and think about your markup, for example using `<div>`s only when it makes sense semantically, and not because they render a new line.
The number of DOM elements is easy to test; just type into Firebug's console: `document.getElementsByTagName('*').length`
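If that total looks high, a quick follow-up sketch can tally elements per tag name to show where the bulk lives. This assumes a reasonably modern console; `console.table` in particular may be missing from older Firebug builds.

```js
// Sketch: count elements per tag name to find where a heavy page's
// bulk comes from. Paste into the browser console.
const all = document.getElementsByTagName('*');
const counts = {};
for (let i = 0; i < all.length; i++) {
  const tag = all[i].tagName;
  counts[tag] = (counts[tag] || 0) + 1;
}
console.table(counts);
```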
-
Are there any accepted best practices on an upper limit you want to stay under in terms of DOM elements? I'm on a pretty fast machine, so while everything runs very smoothly for me, I'm concerned about the experience of users on weaker devices, including mobile devices like the iPad. Is there any consensus on the load these mobile browsers can handle? – Dave Lee Apr 14 '11 at 18:01
-
the iPad is hardly a 'weak' device. The only rule of thumb commonly used is 'keep your page + assets under 100k' but that's just a rule of thumb and has more to do with traditional bandwidth issues. Many web applications are highly complex in terms of DOM structure these days. There is no hard-and-fast rule other than build, test, and retest. Repeat. – DA. Apr 14 '11 at 21:08
We've got a form with over 1,000 fields (don't ask), using jQuery Validate for client-side validation. This includes validating which fields are required, checking the data type of each field, showing/hiding groups of fields based on certain criteria and running calculations across multiple fields as data is entered.
Only MSIE slows down at this scale. Firefox and Chrome run the validation "instantly". MSIE eventually shows the "long running script" dialog. I was notified last night that additional fields are now required.
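For what it's worth, one common workaround for MSIE's "long running script" dialog is to run validation in batches and yield back to the browser between them. This is only a sketch of that idea, not necessarily what this answer's form does; it assumes jQuery Validate is already wired up via `$('form').validate()`, and the `form :input` selector and batch size are made up.

```js
// Sketch: run jQuery Validate's per-field checks in batches, yielding
// between batches so the long-running-script watchdog never fires.
function validateInChunks($fields, batchSize, done) {
  let index = 0;
  (function next() {
    const end = Math.min(index + batchSize, $fields.length);
    for (; index < end; index++) {
      $($fields[index]).valid(); // jQuery Validate's single-element check
    }
    if (index < $fields.length) {
      setTimeout(next, 0); // yield so the UI thread stays responsive
    } else if (done) {
      done();
    }
  })();
}

validateInChunks($('form :input'), 100, function () {
  console.log('all batches validated');
});
```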

-
Are the references to those fields cached (inside variables/arrays/objects) or do you use `$('#id')` to lookup each field during the validation process? – Šime Vidas Apr 13 '11 at 23:52
-
@Šime Vidas - the form's elements collection is already a live collection of its controls. The type of validation required for a particular element is usually indicated by a class value. – RobG Apr 14 '11 at 02:44
-
@RobG Aha, then the slowness in IE can be attributed to its slow JavaScript engine in general. The only solution to improve the situation in IE is to optimize/re-factor the code. – Šime Vidas Apr 14 '11 at 11:04
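A minimal sketch of the caching approach discussed in these comments, using the live `form.elements` collection @RobG mentions instead of repeated `$('#id')` lookups. The `required` class convention here is an illustrative stand-in for whatever class values the real form uses.

```js
// Sketch: a form's elements collection is live, so grab it once and
// reuse it rather than re-running per-field id lookups on each pass.
const form = document.forms[0];  // assumes the page's first form
const controls = form.elements;  // live collection of the form's controls

for (let i = 0; i < controls.length; i++) {
  const field = controls[i];
  if (/\brequired\b/.test(field.className) && field.value === '') {
    // mark this field as failing the "required" check
  }
}
```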