45

Assuming an array of objects as follows:

const listOfTags = [
    {id: 1, label: "Hello", color: "red", sorting: 0},
    {id: 2, label: "World", color: "green", sorting: 1},
    {id: 3, label: "Hello", color: "blue", sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
    {id: 5, label: "Hello", color: "red", sorting: 6},
]

A duplicate entry is one where label and color are the same. In this case, the objects with id = 1 and id = 5 are duplicates.

How can I filter this array and remove duplicates?

I know solutions where you can filter against one key with something like:

const unique = [...new Set(listOfTags.map(tag => tag.label))]

But what about multiple keys?

As requested in a comment, here is the desired result:

[
    {id: 1, label: "Hello", color: "red", sorting: 0},
    {id: 2, label: "World", color: "green", sorting: 1},
    {id: 3, label: "Hello", color: "blue", sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
]
dreftymac
Andy

14 Answers

37

A late answer, but I don't know why nobody suggests something much simpler:

listOfTags.filter((tag, index, array) => array.findIndex(t => t.color == tag.color && t.label == tag.label) == index);
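
Applied to the listOfTags from the question and with the result captured (my wrapper, not part of the original answer), this should produce the desired output:

const uniqueTags = listOfTags.filter((tag, index, array) =>
    array.findIndex(t => t.color == tag.color && t.label == tag.label) == index
);

console.log(uniqueTags); // objects with id 1, 2, 3 and 4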
Sebastien C.
35

You could use a Set in a closure for filtering.

const
    listOfTags = [{ id: 1, label: "Hello", color: "red", sorting: 0 }, { id: 2, label: "World", color: "green", sorting: 1 }, { id: 3, label: "Hello", color: "blue", sorting: 4 }, { id: 4, label: "Sunshine", color: "yellow", sorting: 5 }, { id: 5, label: "Hello", color: "red", sorting: 6 }],
    keys = ['label', 'color'],
    filtered = listOfTags.filter(
        (s => o => 
            (k => !s.has(k) && s.add(k))
            (keys.map(k => o[k]).join('|'))
        )
        (new Set)
    );

console.log(filtered);
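
For readers who find the nested closures hard to follow (see the comments below), here is a rough equivalent with named variables; this is my expansion, not part of the original answer:

const seen = new Set();
const filteredVerbose = listOfTags.filter(tag => {
    // build a compound key from the selected properties, e.g. "Hello|red"
    const compoundKey = keys.map(key => tag[key]).join('|');
    if (seen.has(compoundKey)) {
        return false;          // duplicate, drop it
    }
    seen.add(compoundKey);
    return true;               // first occurrence, keep it
});

console.log(filteredVerbose);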
Nina Scholz
  • 19
    Some comments on the code would be great for some developers who are not so into closures. This is a great example where closures are a good fit. For those who are interested in: The first function inside `listOfTags.filter` is a factory function, which is called immediately with a new empty set `s`. `s` will be available until the filtering is done. The second function is the actual filter function. It is called with every object `o` and returns a boolean. (In this case another closure function makes the actual filter test, with the concatenated fields of the object `o` as params.) – Dave Gööck Oct 09 '19 at 15:20
  • what is `s` and `o`? – Alfrex92 Sep 30 '20 at 08:40
  • 1
    @Alfrex92, `s` is a closure over `new Set` and `o` is just every object of the array. – Nina Scholz Sep 30 '20 at 08:42
  • 1
    @Alfrex92, `k` is another closure over the next line `keys.map(k => o[k]).join('|')` of a joint key of some properties. – Nina Scholz Sep 30 '20 at 08:58
  • @NinaScholz - this [article](https://codeburst.io/javascript-array-distinct-5edc93501dc4) speaks to performance. Have you done any performance tests on your method? – Peter Mendez Apr 06 '21 at 23:43
  • What a life saver! I'll get back to this after my project to understand this one. – JCm Sep 03 '21 at 14:31
  • 1
    This should be written with cleaner code; using single characters is very confusing – Redbeard Nov 08 '22 at 14:16
11

const listOfTags = [
    {id: 1, label: "Hello", color: "red", sorting: 0},
    {id: 2, label: "World", color: "green", sorting: 1},
    {id: 3, label: "Hello", color: "blue", sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
    {id: 5, label: "Hello", color: "red", sorting: 6},
]

const unique = [];

// push x into `unique` only if no entry with the same label and color is already there
listOfTags.forEach(x => unique.filter(a => a.label == x.label && a.color == x.color).length > 0 ? null : unique.push(x));

console.log(unique);
COLBY BROOKS
7

One way is to create an object (or Map) that uses a combination of the two values as the key and the current object as the value, then get the values from that object.

const listOfTags = [
    {id: 1, label: "Hello", color: "red", sorting: 0},
    {id: 2, label: "World", color: "green", sorting: 1},
    {id: 3, label: "Hello", color: "blue", sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
    {id: 5, label: "Hello", color: "red", sorting: 6},
]

const uniques = Object.values(
  listOfTags.reduce((a, c) => {
    a[c.label + '|' + c.color] = c;
    return a
  }, {}))

console.log(uniques)
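
The Map variant mentioned above could look like this (a sketch, not part of the original answer); like the object version, it keeps the last occurrence of each duplicate:

const uniquesFromMap = [...listOfTags.reduce(
    (map, tag) => map.set(tag.label + '|' + tag.color, tag),
    new Map()
).values()];

console.log(uniquesFromMap);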
charlietfl
4

I would tackle this by putting the items into a temporary Map with a composite key based on the properties you're interested in. For example:

const foo = new Map();
for (const tag of listOfTags) {
  // composite key from the properties that define a duplicate
  foo.set(tag.label + '-' + tag.color, tag);
}
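
To get the de-duplicated array back out of the Map, you could then read its values (my addition, not part of the original answer):

const deduplicated = [...foo.values()];
console.log(deduplicated);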
Evert
  • that was one of my first thoughts too, but I find the string concatenation not very elegant. – Andy Nov 29 '18 at 16:11
  • 4
    @Andy it's not that strange. This is basically how hashmaps work, which is the appropriate data structure for this type of thing. – Evert Nov 29 '18 at 17:18
4

Based on the assumption that values can be converted to strings, you can call

distinct(listOfTags, ["label", "color"])

where distinct is:

/**
 * @param {array} arr The array you want to filter for duplicates
 * @param {array<string>} indexedKeys The keys that form the compound key
 *     which is used to filter duplicates
 * @param {boolean} isPrioritizeFormer Set this to true if you want to keep the
 *     first occurrence and remove duplicates that occur later; set it to false
 *     if you want to keep the last occurrence instead.
 */
const distinct = (arr, indexedKeys, isPrioritizeFormer = true) => {
    const lookup = new Map();
    const makeIndex = el => indexedKeys.reduce(
        (index, key) => `${index};;${el[key]}`, ''
    );
    arr.forEach(el => {
        const index = makeIndex(el);
        if (lookup.has(index) && isPrioritizeFormer) {
            return;
        }
        lookup.set(index, el);
    });

    return Array.from(lookup.values());
};

Sidenote: If you use distinct(listOfTags, ["label", "color"], false), it will return:

[
    {id: 1, label: "Hello", color: "red", sorting: 6},
    {id: 2, label: "World", color: "green", sorting: 1},
    {id: 3, label: "Hello", color: "blue", sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
]
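
For comparison, with the default isPrioritizeFormer = true, distinct(listOfTags, ["label", "color"]) should keep the first occurrence of each duplicate and return the desired result from the question:

[
    {id: 1, label: "Hello", color: "red", sorting: 0},
    {id: 2, label: "World", color: "green", sorting: 1},
    {id: 3, label: "Hello", color: "blue", sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
]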
Felix K.
4

You can use reduce here to get filtered objects.

listOfTags.reduce((newListOfTags, current) => {
    if (!newListOfTags.some(x => x.label == current.label && x.color == current.color)) {
        newListOfTags.push(current);
    }
    return newListOfTags;
}, []);
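
Note that the snippet above does not capture the return value; in practice you would assign it, for example (my addition, not part of the original answer):

const filteredTags = listOfTags.reduce((newListOfTags, current) => {
    if (!newListOfTags.some(x => x.label == current.label && x.color == current.color)) {
        newListOfTags.push(current);
    }
    return newListOfTags;
}, []);

console.log(filteredTags); // objects with id 1, 2, 3 and 4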
Ankit Arya
3
const keys = ['label', 'color'];
const mySet = new Set();
const duplicateSet = new Set();
const result = listOfTags.filter((item) => {
  const compoundKey = keys.map((k) => item[k]).join("-");
  // if we have already seen this compound key, record it as a duplicate
  mySet.has(compoundKey) && duplicateSet.add(compoundKey);
  return !mySet.has(compoundKey) && mySet.add(compoundKey);
});

console.log(duplicateSet, result);

This can be used to get both the duplicates and the de-duplicated result.
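
Applied to the listOfTags from the question, duplicateSet should end up containing the single entry "Hello-red", and result should contain the objects with ids 1 through 4.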

2

We can find the unique values with the script below: iterate over the array with a forEach loop, check whether each value already exists in the new array using the some() method, and if it does not, add it to the new array with the push() method.

const arr = [{ id: 1 }, { id: 2 }, { id: 4 }, { id: 1 }, { id: 4 }];
const newArr = [];
arr.forEach((item) => {
    if (newArr.some(el => el.id === item.id) === false) {
        newArr.push(item);
    }
});
console.log(newArr);
// [{id: 1}, {id: 2}, {id: 4}]
Kamal
1
const listOfTags = [
    {id: 1, label: "Hello", color: "red", sorting: 0},
    {id: 2, label: "World", color: "green", sorting: 1},
    {id: 3, label: "Hello", color: "blue", sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
    {id: 5, label: "Hello", color: "red", sorting: 6},
];

let keysList = Object.keys(listOfTags[0]); // get the keys from the first element; otherwise supply your own list of keys

let unq_List = [];

keysList.map(keyEle=>{
  if(unq_List.length===0){
      unq_List = [...unqFun(listOfTags,keyEle)];
  }else{
      unq_List = [...unqFun(unq_List,keyEle)];
  }
});

function unqFun(array,key){
    return [...new Map(array.map(o=>[o[key],o])).values()]
}

console.log(unq_List);
1

const listOfTags = [
    {id: 1, label: "Hello", color: "red", sorting: 0},
    {id: 2, label: "World", color: "green", sorting: 1},
    {id: 3, label: "Hello", color: "blue", sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
    {id: 5, label: "Hello", color: "red", sorting: 6},
];

const objRes = listOfTags.filter((v, i, s) =>
    s.findIndex(v2 => ['label', 'color'].every(k => v2[k] === v[k])) === i);
console.log(objRes);
Ravindr
0

This may be helpful: extract the duplicate items from the array, then delete all duplicates.

// Initial database data
[
    { key: "search", en:"Search" },
    { key: "search", en:"" },
    { key: "alert", en:"Alert" },
    { key: "alert", en:"" },
    { key: "alert", en:"" }
]


// Function called
async function removeDuplicateItems() {
    try {
        // get data from database
        const { data } = (await getList());
        
        // count occurrences of each key (the first occurrence counts as 0)
        const reduceMethod = data.reduce((x, y) => {
            x[y.key] = ++x[y.key] || 0;
            return x;
        }, {});

        // find duplicate items: the key occurs more than once and the "en" attribute is empty
        const duplicateItems = data.filter(obj => !obj.en && reduceMethod[obj.key]);
        console.log('duplicateItems', duplicateItems);

        // remove all duplicate items by id
        duplicateItems.forEach(async (obj) => {
            const deleteResponse = (await deleteItem(obj.id)).data;
            console.log('Deleted item: ', deleteResponse);
        });

    } catch (error) {
        console.log('error', error);
    }
}


// Now database data: 
[
    { key: "search", en:"Search" },
    { key: "alert", en:"Alert" }
]
Nijat Aliyev
0

One solution is to iterate the array and use a Map of maps to store the value-value pairs that have been encountered so far.

Looking up duplicates this way should be reasonably fast (compared to nested loops or .filter + .find approach).

Also the values could be any primitive type; they are not stringified or concatenated for comparison (which could lead to incorrect comparison).

const listOfTags = [
    {id: 1, label: "Hello",    color: "red",    sorting: 0},
    {id: 2, label: "World",    color: "green",  sorting: 1},
    {id: 3, label: "Hello",    color: "blue",   sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
    {id: 5, label: "Hello",    color: "red",    sorting: 6}
];

let map = new Map();
let result = [];
listOfTags.forEach(function(obj) {
    if (map.has(obj.label) === false) {
        map.set(obj.label, new Map());
    }
    if (map.get(obj.label).has(obj.color) === false) {
        map.get(obj.label).set(obj.color, true);
        result.push(obj)
    }
});
console.log(result);
Salman A
0
const listOfTags = [
    {id: 1, label: "Hello", color: "red", sorting: 0},
    {id: 2, label: "World", color: "green", sorting: 1},
    {id: 3, label: "Hello", color: "blue", sorting: 4},
    {id: 4, label: "Sunshine", color: "yellow", sorting: 5},
    {id: 5, label: "Hello", color: "red", sorting: 6},
]

const removeDuplicate = listOfTags.reduce((prev, item) => {
  const isExistInPrevArray = prev.some((i) => i.label === item.label && i.color === item.color);
  if (isExistInPrevArray) {
    return prev;
  }
  return [...prev, item];
}, []);

console.log('removeDuplicate',removeDuplicate)

Output

removeDuplicate [
  { id: 1, label: 'Hello', color: 'red', sorting: 0 },
  { id: 2, label: 'World', color: 'green', sorting: 1 },
  { id: 3, label: 'Hello', color: 'blue', sorting: 4 },
  { id: 4, label: 'Sunshine', color: 'yellow', sorting: 5 }
]
Hardik Desai