5

After reading about optimally-performing JavaScript, I learned it's best to minimize interaction with the DOM, and rely more on the logic within JavaScript itself.

Part of this is using a single event listener for a list of elements and reading the click target, rather than attaching an event listener to each and every one.

ul#menu
  li#one One
  li#two Two
  li#three Three

var $ul = document.getElementById('menu') // note: no '#' when using getElementById
$ul.addEventListener('click', function(e) {
  if (e.target.id == ...) { ... } // You get the idea
})
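
For reference, a fully fleshed-out version of that snippet might look something like this; the per-item handler bodies are hypothetical, purely to make the sketch runnable.

var $ul = document.getElementById('menu')

$ul.addEventListener('click', function(e) {
  // e.target is whichever element inside the list was actually clicked
  if (e.target.id === 'one') {
    console.log('First item clicked')
  } else if (e.target.id === 'two') {
    console.log('Second item clicked')
  } else if (e.target.id === 'three') {
    console.log('Third item clicked')
  }
})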

I have a website with several navbars, dropdown buttons and the like. So why not create a single event listener on the body and just look at the event target?

document.body.addEventListener('click', function(e) {
  if (e.target.id == ...) { ... }
})

I can't tell what's wrong with this solution from a technical perspective. Something about it just seems too heavy; it has a code smell to it.

Would this be problematic on mobile? Does the fact that <body> sits so high in the DOM end up hurting performance more than helping?

I'm not using jQuery, so please suggest pure JS solutions.

Brian Tompsett - 汤莱恩
Adam Grant
  • 1
    Every click that happens on your page will go through that listener. That's the part that people don't like about this method. As long as you keep it sensible, it won't be an issue, but it can quickly get out of hand as your app grows. – Kevin B Oct 30 '14 at 17:13
  • 2
    One thing to note is that using this approach, there's no concept of propagation. Clicking an inner element doesn't "bubble" up to a parent, since you're checking the `e.target.id` only. – Ian Oct 30 '14 at 17:14
  • Thanks, but out of hand how exactly? – Adam Grant Oct 30 '14 at 17:14
  • 6
    Currently, you have one if condition. Later you may have 50. That would mean that for every click NOT on one of your target elements, you would have to go through all 50 if conditions just to realize that nothing needs to happen. To be clear, jQuery event delegation suffers from this same issue, if not used properly. – Kevin B Oct 30 '14 at 17:14
  • Also, the pure JS solution is to not worry about something like this. While DOM interaction *could* be heavy, there's nothing slow about event binding. If you wanted an alternative, you could look into how jQuery handles event bindings - it binds one native event to an element, and keeps track of all callbacks that need to be called, manually calling them when the native event is triggered (if applicable). – Ian Oct 30 '14 at 17:18
  • @KevinB I think that's a good point. Instead of letting the JS fail to find code to execute, let the default behavior of the native event bindings do it for you. – Ian Oct 30 '14 at 17:19
    One way is to add a single listener to the parent element of all these navbars (if it's `body`, then wrap them with `div`). Then create an object with methods named according to the responsive elements' `id`s, or rather, for example, their `data-click`s. This way you need only check whether the object has a matching method, and call it if it exists. – Teemu Oct 30 '14 at 17:33
    There's a convenience in my particular implementation: all said elements have the same class and only need to toggle an additional one (".open"). But perhaps it's better to discuss this in the abstract for the benefit of the community. – Adam Grant Oct 30 '14 at 17:40
  • Can someone post an answer? I think this is beneficial to other devs looking to optimize their scripts without pitfalls. – Adam Grant Oct 30 '14 at 17:57
  • The problem is that people will argue both ways. It comes down to the value an individual places on convenience vs performance, and an answer doesn't apply equally to every situation, as you alluded to in your previous comment where there's a particular convenience to it in your situation. It's a good question, but it's more philosophical in nature. *Maybe* if you rephrased it as something like *"what are the arguments for and against..."*. That's a little more concrete, but I'm still not sure if it's a good fit. –  Oct 30 '14 at 18:14
  • If there are considerations to be made on both sides, an explanation of those considerations would be an answer. – Adam Grant Oct 30 '14 at 21:56
  • 1
    See also [Should all jquery events be bound to $(document)?](http://stackoverflow.com/q/12824549/1048572) and [Why not take Javascript event delegation to the extreme?](http://stackoverflow.com/q/9711118/1048572) – Bergi Dec 11 '14 at 02:04
  • Possible duplicate of [Why not take Javascript event delegation to the extreme?](https://stackoverflow.com/questions/9711118/why-not-take-javascript-event-delegation-to-the-extreme) – machineghost May 17 '18 at 21:06
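
Pulling together Ian's point about `e.target` and Teemu's suggestion above, one possible sketch is below. The `data-click` values and handler names are made up for illustration, and `Element.closest()` (not in the original snippets; supported in current browsers) is used so that clicks on nested children still resolve to the element that declares the action.

// Handlers keyed by data-click value (the names here are hypothetical).
var actions = {
  toggleMenu: function(el) { el.classList.toggle('open') },
  toggleDropdown: function(el) { el.classList.toggle('open') }
}

document.body.addEventListener('click', function(e) {
  // Walk up from the click target to the nearest element declaring an action,
  // so clicks on nested children (icons, spans) still resolve correctly.
  var el = e.target.closest('[data-click]')
  if (!el) return
  var handler = actions[el.dataset.click]
  if (handler) handler(el)
})

A button then declares its behaviour in markup, e.g. `<button data-click="toggleMenu">Menu</button>`, and the single listener stays a few lines long no matter how many actions are added.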

2 Answers

4

After reading about optimally-performing JavaScript, I learned it's best to minimize interaction with the DOM, and rely more on the logic within JavaScript itself.

I'm not sure what this means. Whatever it does, did you also read the part about when to worry about optimization (later, or never, or when you really have to)?

Part of this is using a single event listener for a list of elements and reading the click target, rather than attaching an event listener to each and every one.

This is a model best applied in cases where, for example, you have a hundred <li> elements to listen to events on: rather than attaching an event handler to each and every one, you attach a single event handler to the <ul>. Personally, I'm not convinced there's such a major benefit to doing this, but in any case that is the logic.

There are two reasons why this could be beneficial:

  1. There is only one event handler occupying memory, instead of 100. In this day and age, that is not too convincing.

  2. When new elements are added, there is no need to explicitly add event handlers to them, since an event handler is already in place on their ancestor. This could indeed make program logic simpler.
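
A minimal sketch of point 2, reusing the `ul#menu` list from the question (the logging is just for illustration): items created after the listener is attached are still handled, because their clicks bubble up to the same ancestor.

var menu = document.getElementById('menu')

menu.addEventListener('click', function(e) {
  if (e.target.tagName === 'LI') {
    console.log('Clicked ' + e.target.id)
  }
})

// Added after the listener was attached, yet still handled,
// because its clicks bubble up to the same <ul>.
var newItem = document.createElement('li')
newItem.id = 'four'
newItem.textContent = 'Four'
menu.appendChild(newItem)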

However, extending that to putting one master event handler on the body is going way too far. As one commenter mentioned, the logic in that event handler is going to end up being a massive pile of spaghetti.

A good basic rule is to put event handlers on the element involved, or on a nearby ancestor element when you want to handle events in the same or similar ways for multiple children/descendants.
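
As one possible illustration of that rule of thumb (the `#main-nav` id and `.dropdown-toggle` class are assumptions, not taken from the question), each nav could own a single delegated listener instead of pushing everything up to `<body>`:

var nav = document.getElementById('main-nav')

nav.addEventListener('click', function(e) {
  // Only clicks that happen inside this nav ever reach the handler.
  var toggle = e.target.closest('.dropdown-toggle')
  if (toggle) {
    toggle.classList.toggle('open')
  }
})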

So the answer to your question of "is it crazy" is, yes.

  • 1
    While I agree with your position, it's still a matter of opinion. I saw @Pointy mention the other day that he delegates all handlers to the `body` *(using jQuery)*, and he's certainly a knowledgeable guy. It isn't a given that the logic will end up as *"a massive pile of spaghetti"*. A single delegate doesn't preclude breaking things out into separate functions. So I think your answer is a little too absolute. –  Oct 30 '14 at 18:24
  • Well, opinions are like, what was it again? –  Oct 30 '14 at 18:34
  • 1
    One event handler sounds like low cohesion/high coupling to me. I'd avoid it unless there's a good reason not to. – 1983 Oct 30 '14 at 18:44
  • 1
    I guess there also is quite a difference between 1 event handler that does the same for all descendants, 1 event handler on the body that does 50 things, and 50 event handlers on the body that do one thing each. – Bergi Dec 11 '14 at 01:58
1

Why would you ever want to optimize something like that? Compared with the memory footprint of a single background image or the truckload of jQuery thingies your application will never use, that's peanuts. As for CPU usage, it would be measured in microseconds per hour. I very much doubt your main performance issue will come from there.

Factoring out code is the same story everywhere, be it in a JavaScript event handler, an OpenGL shader or a C++ template. You do it when it proves more useful or convenient than writing slight variations of the same code over and over again.

You might also want to do it because your boss told you so or some influential jerks named it "good practice", though.

kuroi neko