
As the title suggests, I am trying to create a single array of strings:

['string1', 'string2', 'string3', 'string4', 'string5', 'string6']

Out of an array of objects with arrays of strings within those objects:

[
  {
    array_of_strings: ['string1', 'string2', 'string3']
  },
  {
    array_of_strings: ['string1', 'string4', 'string5']
  },
  {
    array_of_strings: ['string6', 'string3', 'string2']
  }
]

As you can see, there is a possibility that the nested arrays may contain the same strings as each other, and I am trying to de-duplicate at the same time. I got very lost in map, filter and reduce, but nothing really outputs the data as needed.

Any guidance would be appreciated.

alphadmon

4 Answers


If I understand your question correctly (you want to get all the elements that appear more than once), we can use Array.filter(), Array.reduce() and Array.map() to do it:

let data = [
  {
    array_of_strings: ['string1', 'string2', 'string3']
  },
  {
    array_of_strings: ['string1', 'string4', 'string5']
  },
  {
    array_of_strings: ['string6', 'string3', 'string2']
  }
]

// Flatten all the nested arrays, then count how often each string occurs
data = data.map(d => d.array_of_strings).flat().reduce((a, v) => {
  a[v] = a[v] ?? 0
  a[v] += 1
  return a
}, {})

// Keep only the strings that occurred more than once
let result = Object.entries(data).filter(i => i[1] > 1).map(i => i[0])
console.log(result)
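
If, on the other hand, the goal is simply every unique string rather than only the duplicated ones, a minimal variant of the same approach could keep every key of the counts object instead of filtering (a sketch reusing the `data` counts object built above; `allUnique` is just an illustrative name):

// Every key of the counts object is one distinct string
let allUnique = Object.keys(data)
console.log(allUnique)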
flyingfox
  • OP has clarified the question now, they don't want to "*get all the elements appear more than once*". Nice solution for that, though! – Bergi Dec 01 '22 at 20:30

You can use a JavaScript Set for this; check the docs:

const array = [
  {
    array_of_strings: ["string1", "string2", "string3"],
  },
  {
    array_of_strings: ["string1", "string4", "string5"],
  },
  {
    array_of_strings: ["string6", "string3", "string2"],
  },
];

// Collect every nested array_of_strings into one flat array
const result = array.reduce((acc, { array_of_strings }) => {
  return acc.concat(array_of_strings);
}, []);

// A Set only keeps unique values, so duplicates are dropped here
const unique = new Set(result);

// Convert the Set back into a plain array
const uniqueArray = Array.from(unique);

console.log(uniqueArray);
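
For what it's worth, `Array.from(unique)` and the spread syntax are interchangeable here; a shorter equivalent sketch using the same `unique` set (`uniqueArray2` is just an illustrative name):

const uniqueArray2 = [...unique]; // same result as Array.from(unique)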
Xiduzo

First you want to "regroup" all of those strings together into the same array:

const fullDatas = [
  {
    array_of_strings: ["string1", "string2", "string3"],
  },
  {
    array_of_strings: ["string1", "string4", "string5"],
  },
  {
    array_of_strings: ["string6", "string3", "string2"],
  },
];

// Merge every array_of_strings into a single flat array
const allStrings = fullDatas.reduce((acc, currentValue) => {
  return acc.concat(currentValue.array_of_strings);
}, []);

Then you can have a look at this thread and use it:

const uniq = [...new Set(allStrings)];
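
With the sample data from the question, `uniq` should then hold the desired result (a quick check, assuming the snippet above):

console.log(uniq);
// ['string1', 'string2', 'string3', 'string4', 'string5', 'string6']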
Seba99

You need none of map/filter/reduce :-) Use flatMap, then deduplicate:

const data = [
  {
    array_of_strings: ['string1', 'string2', 'string3']
  },
  {
    array_of_strings: ['string1', 'string4', 'string5']
  },
  {
    array_of_strings: ['string6', 'string3', 'string2']
  }
];

const result = Array.from(new Set(data.flatMap(o => o.array_of_strings)));
console.log(result);
Bergi
  • although concise, this is not particularly efficient. You're creating two potentially big intermediate objects just to throw them away a second later. – gog Dec 01 '22 at 08:43
  • @gog I can only see one potentially big object (the `flatMap` result)? And its size is still linear to the size of the input. In any case it's not less efficient than any of the other answers. – Bergi Dec 01 '22 at 08:46
  • the second temp one is `Set`. That said, I don't know how smart JS compilers are. When you write `map(...).filter(...).map etc` do they actually allocate those intermediates? Might be worth asking on SO ;) – gog Dec 01 '22 at 08:56
  • @gog The set I wouldn't call "big", it's linear to the size of the output. And surely it's necessary for the deduplication, ensuring linear runtime complexity. Yes, it's temporarily allocated, but there's nothing wrong with that. (Re chaining array methods: js engines do create all the intermediate array objects. You'd need to chain [iterator methods](https://github.com/tc39/proposal-iterator-helpers) to avoid the space overhead. But garbage-collectors are optimised well for this, it's not a problem) – Bergi Dec 01 '22 at 09:02
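
Following up on that comment thread: a single-pass loop avoids the intermediate `flatMap` array entirely, at the cost of a few more lines. This is only a sketch against the same `data` array; the variable names are illustrative:

const seen = new Set();
for (const { array_of_strings } of data) {
  for (const s of array_of_strings) {
    seen.add(s); // the Set silently ignores strings it has already seen
  }
}
const deduped = [...seen];
console.log(deduped);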