
I have an array of objects:

const data = [{
    productId: 7000254,
    quantity: 1
}, {
    productId: 7000255,
    quantity: 1
}, {
    productId: 7000256,
    quantity: 1
}, {
    productId: 7000257,
    quantity: 1
}, {
    productId: 7000254,
    quantity: 1
}];

I need to get the unique values from it (unique by productId) using the reduce function.

I made it work using the code below:

const products = [];
data.map((rp) => {
  if (products.map(({ productId }) => productId).indexOf(rp.productId) === -1) {
    products.push({ productId: parseInt(rp.productId), quantity: 1 })
  }
})

but as you can see it's a lengthy process because I have to iterate over the array multiple times. So is there a way to do it with the reduce function? This is my attempt, but it doesn't work:

var unique = data.reduce((a, b ,c,d) => {
  if (a.map(({productId}) => productId).indexOf(b.productId) === -1) {
    return [a,b]
  }
})
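// Note: this attempt throws a TypeError ("a.map is not a function") because reduce
// is called without an initial value, so the accumulator starts out as the first
// object in the array rather than an array.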
console.log(unique)

Expected output

0: {productId: 7000254, quantity: 1}
1: {productId: 7000255, quantity: 1}
2: {productId: 7000256, quantity: 1}
3: {productId: 7000257, quantity: 1}
Profer

3 Answers


You can efficiently achieve this result using filter and Set.

const data = [{
    productId: 7000254,
    quantity: 1,
  },
  {
    productId: 7000255,
    quantity: 1,
  },
  {
    productId: 7000256,
    quantity: 1,
  },
  {
    productId: 7000257,
    quantity: 1,
  },
  {
    productId: 7000254,
    quantity: 1,
  },
];

const set = new Set();
const result = data.filter((o) => {
  if (set.has(o.productId)) return false;
  set.add(o.productId);
  return true;
});

console.log(result);
DecPK
  • Yeah, I would choose this over reduce. It's easier to read. – Totati Sep 01 '21 at 06:48
  • @Totati Time complexity is also O(n), compared to `reduce` used with `find`. – DecPK Sep 01 '21 at 06:49
  • You could use Set with reduce too, but I'd still choose filter :D You could use Map too, but that would mean 2 iterations, as at the end you need to map the values back to an array (a sketch of the Map variant follows these comments). – Totati Sep 01 '21 at 06:57
  • @HR01M8055 please mind that you're doing a `set.has` on every item of data, so that's O(n*m) ~ O(n^2). That doesn't reduce the quality of your answer, it is still a very elegant one. – malarres Sep 01 '21 at 07:01
  • @malarres `set.has` will give the result in constant time `O(1)`. – DecPK Sep 01 '21 at 07:04
  • Nice to know @HR01M8055, I got from the spec that it is required to be `O(n)` at most, but I didn't know that it was implemented as `O(1)`. Source: `Set objects must be implemented using either hash tables or other mechanisms that, on average, provide access times that are sublinear on the number of elements in the collection.` https://262.ecma-international.org/6.0/#sec-set-objects – malarres Sep 01 '21 at 07:12
  • @malarres You are right; I came to know more about Set [here on Stack Overflow](https://stackoverflow.com/questions/55057200/is-the-set-has-method-o1-and-array-indexof-on) – DecPK Sep 01 '21 at 07:53
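
For illustration only (not part of the answer above), a rough sketch of the Map variant Totati mentions: build a Map keyed by productId, then turn its values back into an array in a second pass.

const byId = new Map(data.map((item) => [item.productId, item]));
// Later duplicates overwrite earlier entries; since the duplicate objects here are
// identical, the result still matches the expected output.
const uniqueItems = Array.from(byId.values()); // second iteration: values back into an array

console.log(uniqueItems);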

Array.reduce implementation.

Logic

  • Loop through the array.
  • Before pushing the current item to the accumulator, check whether a node with the same productId is already present in the accumulator.
  • If such a node exists, don't push; otherwise push the item to the accumulator.

const data = [{
  productId: 7000254,
  quantity: 1
}, {
  productId: 7000255,
  quantity: 1
}, {
  productId: 7000256,
  quantity: 1
}, {
  productId: 7000257,
  quantity: 1
}, {
  productId: 7000254,
  quantity: 1
}];
const output = data.reduce((acc, curr) => {
  const matchingNode = acc.find(node => node.productId === curr.productId);
  if(!matchingNode) {
    acc.push(curr);
  }
  return acc;
}, []);
console.log(output)
Nitheesh
  • `find` inside `reduce` makes its worst-case time complexity `O(n^2)`. You could have used `Set` here to make it `O(n)` (a sketch follows these comments). – DecPK Sep 01 '21 at 06:48
  • @HR01M8055 The question is regarding functionality, not regarding performance. – Nitheesh Sep 01 '21 at 06:54
  • I totally agree with you. But if we get functionality with performance then that will be even better. What is the use of the functionality if it is not efficient? Don't you think? – DecPK Sep 01 '21 at 07:01
  • @HR01M8055 Agree. There are a lot of possible performance optimizations; one is provided by you. Do I need to find all the performance improvements and fix them? – Nitheesh Sep 01 '21 at 07:09
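
For reference, a rough sketch of the reduce + Set variant DecPK suggests (an illustration, not part of the original answer): the Set keeps each duplicate check at O(1) on average, so the whole reduction stays O(n).

const seen = new Set();
const uniqueProducts = data.reduce((acc, curr) => {
  if (!seen.has(curr.productId)) { // O(1) lookup instead of scanning acc with find
    seen.add(curr.productId);
    acc.push(curr);
  }
  return acc;
}, []);

console.log(uniqueProducts);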

The best way to do this without iterating over the array multiple times is to use reduce, something like this:

const output = data.reduce((pv, cv) => (
  pv.array.indexOf(cv.productId) === -1 // new value
    ? {
        array: [...pv.array, cv.productId],
        output: [...pv.output, cv]
      }
    : pv
), { array: [], output: [] })

console.log({ output }) // the unique objects end up under output.output; output.array holds the ids already seen
Greg K
Profer