TL;DR
const records = array1.map((a, i) => [a, array2[i], array3[i]]);
const index = {};
const newData = records.filter(row => {
  const key = JSON.stringify(row);
  return key in index ? false : (index[key] = true);
});
Huh?
There are a lot of ways to solve this, with varying degrees of efficiency, and the best solution will depend on the size of your data. A simple but naïve solution iterates over each "column" and checks all of the preceding columns for equality. It looks like this:
const array1 = ['A', 'B', 'A', 'B'];
const array2 = [5, 5, 7, 5];
const array3 = [true, true, true, true];

const newArray1 = array1.slice(0, 1); // column 0 is never a duplicate
const newArray2 = array2.slice(0, 1);
const newArray3 = array3.slice(0, 1);

// loop over columns starting with index 1
outer: for (let i = 1; i < array1.length; i++) {
  const a = array1[i];
  const b = array2[i];
  const c = array3[i];
  // check all preceding columns for equality
  for (let j = 0; j < i; j++) {
    if (a === array1[j] && b === array2[j] && c === array3[j]) {
      // duplicate; continue at top of outer loop
      continue outer;
    }
  }
  // not a duplicate; add to new arrays
  newArray1.push(a);
  newArray2.push(b);
  newArray3.push(c);
}

console.log(newArray1); // => ['A', 'B', 'A']
console.log(newArray2); // => [5, 5, 7]
console.log(newArray3); // => [true, true, true]
As you can see, we have to check each row within each column for equality, every time. If you're curious, the complexity of this is O(n(n+1)/2) (technically O(kn(n+1)/2), where k is 3 for three columns).
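(Where does that figure come from? One way to count it: the column at index i is checked against up to i preceding columns, plus a constant amount of loop bookkeeping, so the worst case, with no duplicates at all, does roughly 1 + 2 + ... + n = n(n+1)/2 units of work.)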
For larger data sets it's advantageous to keep track of values you've already seen in a data structure that's quick to access: a hash, a.k.a. a JavaScript object. Since all of your values are primitive, a quick way to construct a key is JSON.stringify. Some might consider this a "hack" (and it's important to note that it will fail with values that can't be represented in JSON, e.g. Infinity or NaN), but it's a fast and easy one with data this simple.
const array1 = ['A', 'B', 'A', 'B'];
const array2 = [5, 5, 7, 5];
const array3 = [true, true, true, true];

const newArray1 = [];
const newArray2 = [];
const newArray3 = [];
const index = {};

for (let i = 0; i < array1.length; i++) {
  const a = array1[i];
  const b = array2[i];
  const c = array3[i];
  const key = JSON.stringify([a, b, c]);
  if (key in index) {
    // duplicate; skip to top of loop
    continue;
  }
  // not a duplicate; record in index and add to new arrays
  index[key] = true;
  newArray1.push(a);
  newArray2.push(b);
  newArray3.push(c);
}

console.log(newArray1); // => ['A', 'B', 'A']
console.log(newArray2); // => [5, 5, 7]
console.log(newArray3); // => [true, true, true]
The complexity of this is O(n), or maybe O(2kn), where k, again, is 3 for three columns, and the 2 is another constant factor to very roughly account for the cost of JSON.stringify. (Figuring out the cost of hash access is left as an exercise for the pedants among us; I'm content to call it O(1).)
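As an aside, if the bare-object index bothers you, an ES2015 Set does the same bookkeeping and reads a little more directly than the key in index idiom. Here's the core of the same loop rewritten with a Set, a sketch assuming the same arrays and output variables as above:

const seen = new Set();
for (let i = 0; i < array1.length; i++) {
  const key = JSON.stringify([array1[i], array2[i], array3[i]]);
  if (seen.has(key)) continue; // duplicate; skip it
  seen.add(key);
  newArray1.push(array1[i]);
  newArray2.push(array2[i]);
  newArray3.push(array3[i]);
}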
That's still pretty verbose. Part of the reason is that using three different variables for the data (which is really a single "table") leads to a lot of repetition. We can preprocess the data to make it easier to deal with. Once it's "transposed" into a single two-dimensional array, we can use Array.prototype.filter with the key technique from above, for some very terse code:
const array1 = ['A', 'B', 'A', 'B'];
const array2 = [5, 5, 7, 5];
const array3 = [true, true, true, true];

// turn "columns" into "rows" of a 2D array
const records = array1.map((a, i) => [a, array2[i], array3[i]]);

const index = {};
const newData = records.filter(row => {
  const key = JSON.stringify(row);
  return key in index ? false : (index[key] = true);
});

console.log(newData); // => [['A', 5, true], ['B', 5, true], ['A', 7, true]]
Of course, pre-processing isn't free, so this code isn't any more performant than the more verbose version; you'll have to decide how important that is to you. If you want, you can now extract the columns from newData into three variables (newData.forEach(([a,b,c]) => { newArray1.push(a); newArray2.push(b); /* ... */ })), but for many purposes the "transposed" 2D array will be easier to work with.
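Spelled out, that extraction looks like this (with fresh arrays; the /* ... */ above is just the third push):

const newArray1 = [];
const newArray2 = [];
const newArray3 = [];
newData.forEach(([a, b, c]) => {
  newArray1.push(a);
  newArray2.push(b);
  newArray3.push(c);
});

console.log(newArray1); // => ['A', 'B', 'A']
console.log(newArray2); // => [5, 5, 7]
console.log(newArray3); // => [true, true, true]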