
This is a check to make sure I am not complicating my programs more than they need to be. I have some code that finds child nodes, filtered by tagName, and then grabs their data attributes. This all works smoothly.

However, there are duplicates in the data attributes, so they must be filtered down to unique values. Therefore I map through again and push to a newList as long as the value is not already in there.

The problem I see is that this creates a nested loop inside of a loop, probably slowing things down. Is there a faster way to do this?

Please note I am not asking for your opinion on what is best here. I just want to know if there are options that would be faster.

  var productList = Array.from(document.querySelectorAll(".grid li"))

  var _productList = productList.map(function (item) {
    var a = Array.from(item.children)
    var b = a.filter(item => item.tagName == "SPAN").map(item => item.dataset.colors)
    let newList = []
    b.map(i => !newList.includes(i) && newList.push(i))
    return newList
  })

  console.log(_productList)
// 0: (2) ["yellow", "white"]
// 1: ["gray"]
// 2: ["white"]
// 3: ["white"]
// 4: ["light_brown"]
// 5: (2) ["beige", "white"]
// 6: ["blue"]
// 7: (2) ["burgandy", "White"]
eat mangos
  • it's better you use `b.forEach()` as you are not using the return value of the `map` – Addis Jan 08 '20 at 22:25
  • Thanks @Addis, yup. Silly mistake. But the question still stands: this nested loop (`forEach`, not `map`, right?), is this standard practice? Or is there a faster way without that nested loop? – eat mangos Jan 08 '20 at 22:31
  • you can do it without a nested loop actually – Addis Jan 08 '20 at 22:33
  • Does this answer your question? [Get all unique values in a JavaScript array (remove duplicates)](https://stackoverflow.com/questions/1960473/get-all-unique-values-in-a-javascript-array-remove-duplicates) – kornieff Jan 08 '20 at 22:37
  • If anything is slowing this down, it's the DOM access. Better to have your data separate from your DOM so that you don't have to do this kind of stuff based on DOM queries. – Heretic Monkey Jan 08 '20 at 22:37

2 Answers


One optimization could be to use Sets instead of creating custom logic to filter duplicates.

By doing so, you hand duplicate filtering over to the browser's JS engine, which probably does it faster (or at least not slower):

  var productList = Array.from(document.querySelectorAll(".grid li"))

  var _productList = productList.map(item => [
    ...new Set(
      Array
        .from(item.children)
        .filter(item => item.tagName == "SPAN")
        .map(item => item.dataset.colors)
    )
  ])

  console.log(_productList)
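To see the Set idea in isolation, here is a minimal, DOM-free sketch using made-up sample data (the real code reads `data-colors` attributes from the DOM instead):

```javascript
// Made-up sample rows standing in for the per-item color lists.
const rows = [
  ["yellow", "white", "yellow"],
  ["gray"],
  ["beige", "white", "beige"]
];

// new Set(row) drops duplicates; spreading it back gives an array again.
const deduped = rows.map(row => [...new Set(row)]);

console.log(deduped);
// [ [ 'yellow', 'white' ], [ 'gray' ], [ 'beige', 'white' ] ]
```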
FZs

Here's a solution that lets JavaScript do the work for you.

let values = ["1", "2", "2", "3", "3", "3"];
let distinctValues = {};
for (let value of values) {
  // JavaScript's objects are effectively string-keyed maps, which inherently have unique keys.
  // Therefore we can place all values into the object and let JavaScript handle the de-duplication.
  distinctValues[value] = null;
}
for (let distinctValue in distinctValues) {
  // This only prints distinct values (1, 2, 3)
  console.log(distinctValue);
}
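A related sketch: `Object.keys()` returns only the object's own enumerable string keys, so it can replace the `for...in` loop without also walking inherited enumerable properties (sample values are made up):

```javascript
// Variant of the object-as-map idea above, collecting keys with Object.keys()
// instead of a for...in loop. Sample values are made up for illustration.
let values = ["1", "2", "2", "3", "3", "3"];
let distinctValues = {};
for (let value of values) {
  distinctValues[value] = null; // duplicate keys simply overwrite each other
}
// Own enumerable keys only; no inherited properties are included.
let distinct = Object.keys(distinctValues);
console.log(distinct); // [ '1', '2', '3' ]
```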
V Maharajh
  • Or just: `const distinctValues = new Set(values);`? – jarmod Jan 08 '20 at 23:16
  • @VivekMaharajh This solution requires *two* loops; it will also iterate over the object's prototype chain, and if there are any enumerable properties there, it will list them as well. The OP asked for a *faster* way, but this may be even slower than theirs. – FZs Jan 09 '20 at 06:23
  • Despite being two loops, this is a linear O(n) solution since the loops aren't nested. The original code was O(n^2) since the array enumeration was happening inside of another array enumeration. – V Maharajh Jan 09 '20 at 22:44
  • Your answer to use Set is cleaner though. I didn't realize that Set was so widely supported in browsers, so I'm glad OP went with that approach regardless :) – V Maharajh Jan 09 '20 at 22:55