
I have a problem where I need to filter duplicate values out of an array, e.g. `[1,2,3,3,3,4]` -> `[1,2,3,4]`.

Currently I've written the following code, which works, but I don't think there is enough redundancy.

const deduper = (arrayToDedupe) =>
  Object.values(
    // Pass 6: spread into an object keyed by index, then take the values back out
    Object.assign({}, [
      // Pass 5: a second Set, in case the first four passes missed something
      ...new Set(
        // Pass 1: a Set alone already removes every duplicate
        [...new Set(arrayToDedupe)]
          // Pass 2: keep only the first occurrence of each element
          .filter((element, index, array) => index === array.indexOf(element))
          // Pass 3: rebuild the array, skipping anything already collected
          .reduce((acc, elementv2) => {
            if (acc.includes(elementv2)) {
              return acc;
            } else {
              acc.push(elementv2);
              return acc;
            }
          }, [])
          // Pass 4: map repeats to undefined, then drop them; the explicit
          // undefined check keeps falsy values such as 0 and '' alive
          .map((elementv3, indexv3, arrayv3) => {
            if (indexv3 === arrayv3.indexOf(elementv3)) {
              return elementv3;
            } else {
              return undefined;
            }
          })
          .filter((x) => x !== undefined)
      ),
    ])
  )
    // Pass 7: sort (note this reorders the output), then drop neighbours that match
    .sort()
    .map((element, index, array) => {
      if (array[index + 1] === element) return undefined;
      return element;
    })
    .filter((x) => x !== undefined);
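
For reference, running it on the example input gives the following (expected output shown in the comment; the final `.sort()` means results come back sorted):

console.log(deduper([1, 2, 3, 3, 3, 4])); // [1, 2, 3, 4]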

Is there a way to really, really, really ensure, without a doubt, that the returned array will not have any duplicates? Do I need to add more chained array methods?
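
The closest thing I have to a real guarantee is checking the result after the fact rather than adding more passes. A minimal sketch (`hasDuplicates` is just an illustrative name I made up, not a library function); it works because a `Set` can never be larger than its input, and it only has the same size when nothing was removed as a duplicate:

const hasDuplicates = (array) => new Set(array).size !== array.length;

console.log(hasDuplicates(deduper([1, 2, 3, 3, 3, 4]))); // false
console.log(hasDuplicates([1, 2, 2])); // true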

memerson
  • Do you have an example of the data you'd like to process? – Nina Scholz Aug 06 '20 at 20:17
  • Put them into a `Set` and then back to an `Array` ;) and you're done (this applies to numbers and basic data types, not including objects) – Dominik Matis Aug 06 '20 at 20:19
  • For the data set, you can think of it as being an array like the following: `[1,2,3,3,3,4,5]`. It could also be strings, like: `['never', 'rick', 'you', 'you', 'give', 'give', 'gonna', 'up' ]` – memerson Aug 06 '20 at 20:26
  • The point of having any *'redundancy'* at all is not clear. What is your concern, exactly? Which kind of input data requires spaghettizing your code that much? – Yevhen Horbunkov Aug 06 '20 at 20:27
  • @memerson : the answer that was given addresses both of those use cases perfectly. – Yevhen Horbunkov Aug 06 '20 at 20:28
  • The redundancy in this case is meant to go against all code quality ideas. It is to redundantly loop through the array almost `ad nauseam` with every possible filtering method for duplicate elements. – memerson Aug 06 '20 at 20:38
  • Please add some of the data you are talking about; otherwise it is a duplicate of just getting an array with unique items. – Nina Scholz Aug 06 '20 at 20:52
  • This is a solid piece of refuctoring. Keith BankAccount from the Institute of Mortgage Driven Development couldn't have done it better! – shuckster Aug 06 '20 at 22:01

1 Answer


You can use this one-liner solution with ECMAScript 6:

const uniqueArray = [...new Set([1,2,3,3,3,4])];
console.log(uniqueArray);

A `Set` is a data structure that holds unique values, without order. The trick to know is that spreading a `Set` into an array literal gives you back a de-duplicated array.
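
To illustrate with the string data from the comments, plus the caveat Dominik Matis mentions (a `Set` compares objects by reference, so structurally equal objects are not treated as duplicates), a quick sketch:

// The same one-liner handles strings:
console.log([...new Set(['never', 'rick', 'you', 'you', 'give', 'give', 'gonna', 'up'])]);
// ['never', 'rick', 'you', 'give', 'gonna', 'up']

// Caveat: objects are compared by reference, not by structure
console.log([...new Set([{ a: 1 }, { a: 1 }])].length); // 2 – different references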

Maxime Helen
  • This is good, and to be honest the best method if I were trying to expose the _beauty_ of JavaScript, but I need it to be as redundant as possible; effectively we are using this as one of the filtering methods mentioned above. – memerson Aug 06 '20 at 20:22
  • @memerson : consider giving an example of data that will not get de-duped by the above code, to make some sense of the *redundancy* you're talking about – Yevhen Horbunkov Aug 06 '20 at 20:25
  • Shoutout to @Dominik Matis, who answered this first in the comments – Maxime Helen Aug 07 '20 at 00:39