EDIT: I misunderstood the OP; it's not about the intersection, but about extracting the values that are unique to a single array (i.e. NOT the intersection). For that, this might work:
```js
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];

const thereCanOnlyBeOne = function(...arrs) {
  return Array.from(
    // count, for every value, the number of arrays it appears in;
    // new Set(arr) dedupes repeats within a single array first
    arrs.reduce((map, arr) => {
      new Set(arr).forEach((v) => map.set(v, map.has(v) ? map.get(v) + 1 : 1));
      return map;
    }, new Map())
  )
    // keep only the values that occurred in exactly one array
    .filter(([value, count]) => count === 1)
    .map(([value, count]) => value);
};

console.log(thereCanOnlyBeOne(arr1, arr2, arr3)); // ["d", "f"]
```
I would think @gog's answer is way more sophisticated and probably much faster, but I had a slightly hard time wrapping my head around it (call me stupid, I take it =D; EDIT: had to do some research and read up on bitsets here and here), so here's the breakdown of the slightly convoluted way of doing this with a Map and array methods (an unrolled step-by-step sketch follows the list):
- Pass all arrays to be analyzed into the function; the order doesn't matter.
- Loop (I chose reduce, but any loop structure works) through all input arrays and their values, counting for each value the number of arrays it occurs in (the `new Set(arr)` dedupes repeats within a single array). At the end, the Map will look as follows:
0: {"a" => 4}
1: {"b" => 3}
2: {"c" => 3}
3: {"d" => 1}
4: {"f" => 1}
- Once done with that, we convert the Map back into an array via `Array.from()`, creating an array of tuples:
```js
[
  ["a", 3],
  ["b", 2],
  ["c", 3],
  ["d", 1],
  ["f", 1],
]
```
- Filter the resulting array of tuples (now in the form `[<value>, <count>]`) so that only values that occurred exactly once are left, leaving us with:
```js
[
  ["d", 1],
  ["f", 1],
]
```
- Map over the filtered array to "dumb" it down into a one-dimensional array again and return the result:
["d", "f"]
WARNING: Internally this code runs a ****load of loops, so consider it a brute-force approach as well; it just looks "shorter" thanks to the "sexy" ES6 array syntax sugar.
For completeness, here's a slightly modified version: the `Array.filter()` step can be omitted (although the `.filter()` version seems to be the faster one) by iterating the counter-Map once it's finalized and simply deleting entries whose count is not 1.
```js
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];

const thereCanOnlyBeOne = function(...arrs) {
  // the result of .reduce will be a Map!
  const counts = arrs.reduce((map, arr) => {
    new Set(arr).forEach((v) => map.set(v, map.has(v) ? map.get(v) + 1 : 1));
    return map;
  }, new Map());
  // delete every entry whose count isn't exactly 1, then return the keys
  counts.forEach((count, key, map) => { count !== 1 && map.delete(key); });
  return Array.from(counts.keys());
};

console.log(thereCanOnlyBeOne(arr1, arr2, arr3)); // ["d", "f"]
```
UPDATE: As @Nick Parsons pointed out, the previous version of this code would not output elements that were present in only one array, but multiple times:
> This will produce an incorrect output if one array contains the same value multiple times and that element isn't present in any other array. E.g., if you remove `b` from `arr2`, then only `arr1` has `b` in it, so `b` should be included in the final result.
This is easily solved by turning each array being checked into a `Set()` (thereby reducing each array's values to unique ones), which is what the `new Set(arr)` in the code above does.
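To make that pitfall concrete, here's a minimal repro of what goes wrong without the Set dedup (hypothetical arrays, with `b` removed from the second one as Nick described):

```js
const a1 = ['a', 'b', 'c', 'a', 'b'];
const a2 = ['a', 'd', 'c']; // 'b' removed: it now only exists in a1
const counts = new Map();
for (const arr of [a1, a2]) {
  for (const v of arr) { // counting raw occurrences instead of new Set(arr)
    counts.set(v, (counts.get(v) ?? 0) + 1);
  }
}
console.log(counts.get('b')); // 2 -> 'b' would be wrongly filtered out
// with `new Set(arr)` the count is 1 and 'b' correctly ends up in the result
```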
If anyone (besides me) wonders, here's a benchmark between gog's options and mine. His bitset approach is clearly the fastest, so if you are comparing fewer than 32 arrays, that's by far the most performant solution: https://jsben.ch/YkKSu
And if anyone prefers an ES6-ified version of gog's bitset implementation (improved by @ralphmerridew's suggestion), here you go:
```js
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];

function onlyone(...arrays) {
  return Object.entries(
    // build a bitmap per value: bit n is set if the value occurs in array n
    arrays.reduce((map, arr, n) => {
      arr.forEach((v) => map[v] = (map[v] ?? 0) | (1 << n));
      return map;
    }, {})
  )
    // keep a value only if its bitmap has exactly one bit set
    .filter(([value, bitmap]) => (bitmap & (bitmap - 1)) == 0)
    .map(([value, bitmap]) => value);
}

console.log(onlyone(arr1, arr2, arr3)); // ["d", "f"]
```
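The `bitmap & (bitmap - 1)` filter is the part worth a second look: subtracting 1 flips the lowest set bit and everything below it, so the AND clears the lowest set bit; the result is 0 exactly when the bitmap had at most one bit set, i.e. the value occurred in exactly one array. A tiny illustration (helper name is mine):

```js
const inOneArrayOnly = (bitmap) => (bitmap & (bitmap - 1)) === 0;

console.log(inOneArrayOnly(0b100)); // true:  present only in array 2
console.log(inOneArrayOnly(0b101)); // false: present in arrays 0 and 2
// e.g. 0b101 - 1 = 0b100, and 0b101 & 0b100 = 0b100, which is !== 0
```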
I updated the benchmark with this as well. Interestingly (or unexpectedly), this "slower"-looking ES6 implementation somehow beats gog's for-loop implementation by a tad; I tested in Chrome and Firefox multiple times because I couldn't believe it myself. I had assumed those syntax-sugar methods would slow things down slightly compared to plain for loops. Well... good to know =)
I also tried implementing the bitset approach with BigInt() to remove the 32-array limit (depending on the engine, a BigInt bitmap should be able to handle somewhere between 1 million and 1 billion arrays). Unfortunately, that seems to make it the slowest of all the solutions (benchmark updated):
```js
let arr1 = ['a', 'b', 'c', 'a', 'b'];
let arr2 = ['a', 'd', 'b', 'c'];
let arr3 = ['f', 'c', 'a'];

function onlyoneBigInt(...arrays) {
  return Object.entries(
    // same bitmap idea, but BigInt lifts the 32-bit limit on n
    arrays.reduce((map, arr, n) => {
      arr.forEach((v) => map[v] = (map[v] ?? 0n) | (1n << BigInt(n)));
      return map;
    }, {})
  )
    .filter(([value, bitmap]) => (bitmap & (bitmap - 1n)) === 0n)
    .map(([value, bitmap]) => value);
}

console.log(onlyoneBigInt(arr1, arr2, arr3)); // ["d", "f"]
```
Maybe someone sees something that can be improved to make this faster? (My guess, and it is just a guess: BigInt values don't fit the engines' small-integer fast paths, so every `|` and `<<` has to allocate, but I'd be happy to be corrected.)