
I have an object that contains an array of objects.

obj = {};

obj.arr = new Array();

obj.arr.push({place:"here",name:"stuff"});
obj.arr.push({place:"there",name:"morestuff"});
obj.arr.push({place:"there",name:"morestuff"});

I'm wondering what is the best method to remove duplicate objects from an array. So for example, obj.arr would become...

{place:"here",name:"stuff"},
{place:"there",name:"morestuff"}
chickens
Travis
  • Do you mean how do you stop a hashtable/object with all the same parameters being added to an array? – Matthew Lock Feb 08 '10 at 00:46
  • Mathew -> If it is simpler to prevent a duplicate object from being added to the array in the first place, instead of filtering it out later, yes, that would be fine too. – Travis Feb 08 '10 at 01:01
  • Suuuper long answers and yet MDN has possibly the shortest: ```arrayWithNoDuplicates = Array.from(new Set(myArray))``` – tonkatata Dec 06 '21 at 21:47
  • @tonkatata This doesn't work with array of objects. – Debu Shinobi Dec 14 '21 at 07:50
  • Hello, Please find below a simple and reusable way to manage duplicates https://stackoverflow.com/a/74544470/12930883 – RED-ONE Nov 23 '22 at 09:36

77 Answers


How about with some ES6 magic?

obj.arr = obj.arr.filter((value, index, self) =>
  index === self.findIndex((t) => (
    t.place === value.place && t.name === value.name
  ))
)


A more generic solution would be:

const uniqueArray = obj.arr.filter((value, index) => {
  const _value = JSON.stringify(value);
  return index === obj.arr.findIndex(obj => {
    return JSON.stringify(obj) === _value;
  });
});

Using the property-comparison strategy from above instead of JSON.stringify:

const isPropValuesEqual = (subject, target, propNames) =>
  propNames.every(propName => subject[propName] === target[propName]);

const getUniqueItemsByProperties = (items, propNames) => 
  items.filter((item, index, array) =>
    index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNames))
  );

You can add a wrapper if you want the propNames property to be either an array or a value:

const getUniqueItemsByProperties = (items, propNames) => {
  // Note: Array.from would split a string like 'place' into single characters,
  // so test for an array explicitly instead.
  const propNamesArray = Array.isArray(propNames) ? propNames : [propNames];

  return items.filter((item, index, array) =>
    index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNamesArray))
  );
};

allowing both getUniqueItemsByProperties('a') and getUniqueItemsByProperties(['a']);


Explanation

  • Start by understanding the two methods used: Array.prototype.filter() and Array.prototype.findIndex().
  • Next, decide what makes two of your objects equal and keep that criterion in mind.
  • An item is a duplicate if it satisfies the criterion but is not the first instance in the array that does so.
  • The filter therefore keeps only each first instance and drops the rest.
danronmoon
Eydrian
  • This can be shortened to: `things.thing = things.thing.filter((thing, index, self) => self.findIndex(t => t.place === thing.place && t.name === thing.name) === index)` – Josh Cole Mar 13 '17 at 12:12
  • Works perfectly! var uniqueArrayOfObjects = arrayOfObjects.filter(function(obj, index, self) { return index === self.findIndex(function(t) { return t['obj-property'] === obj['obj-property'] }); }); Make sure you use proper JS syntax. – Mohamed Salem Lamiri Nov 14 '17 at 12:55
  • Those are proper JS syntax. Yours is not using 1) fat arrow functions 2) implicit return or 3) dot notation. Yours is ES5 syntax. The others are mostly ES6 (ECMA2015). All are valid in 2017. See jaredwilli's comment. – agm1984 Nov 15 '17 at 09:20
  • They are also using short circuiting along with implicit return, which is that double ampersand `&&` stuff. Definitely not your weekend stroll in the park type stuff. Brutal to read, extremely efficient. – agm1984 Nov 15 '17 at 09:26
  • This doesn't answer the title but only the very specific example in the body of the question. not robust :/ – vsync Jul 09 '18 at 16:59
  • This can again be shortened to: things.thing = things.thing.filter((thing, index, self) => self.findIndex(t => JSON.stringify(t) === JSON.stringify(thing)) === index) so that we don't have to check each and every item inside an object – BKM Jul 17 '18 at 15:00
  • @vsync just take @BKM's answer and put it together, a generic solution would be: `const uniqueArray = arrayOfObjects.filter((object,index) => index === arrayOfObjects.findIndex(obj => JSON.stringify(obj) === JSON.stringify(object)));` http://jsfiddle.net/x9ku0p7L/28/ – Eydrian Jul 18 '18 at 11:33
  • The key here is that the findIndex() method returns the index of the **first** element, so if there is a second element that matches, it will never be found and added during the filter. I was staring at it for a minute :) – JBaczuk Sep 13 '19 at 23:21
  • In case anyone needs a short utility function: export const uniqueArray = arr => arr.filter((obj, i) => i === arr.findIndex(o => JSON.stringify(o) === JSON.stringify(obj))) – Andrew Bogdanov Oct 10 '19 at 12:59
  • The "more generic solution" might not work in some cases as the order of `JSON.stringify` is unpredictable. Besides you are not caching the `JSON.stringify(obj)` so it will be called more times than you expect it to which is a major performance hit – ibrahim mahrir Nov 06 '19 at 11:25
  • @ibrahimmahrir thanks, I updated it accordingly to your input – Eydrian Nov 07 '19 at 07:31
  • `things.thing = things.thing.filter((thing, index, self) => index === self.indexOf(thing) )` could work. – Yousername Dec 23 '19 at 23:40
  • One question, wouldn't this be an O(n^2) approach. In case I'm working with 30 records, I'd be doing 900 iterations, right? (Worst case scenario, no duplicates) – Jose A Apr 23 '20 at 10:29
  • If you have an array with 200,000 entries then this will take 40 BILLION iterations. This should never be used with a large array. Always use a map instead. – JP_ Feb 23 '21 at 22:54
  • i think this one is more cleaner to use with out compareing with findIndex occurences `const newArray = arr.map(JSON.stringify).filter((el , i , arr)=> i === arr.indexOf(el)).map(JSON.parse)` its works perfectly fine try it once – jsBug Jun 09 '22 at 12:39
  • strange thing is code works no matter if I change the property names `place` & `name` in `findIndex` according to my own object or not – Mani Jun 28 '22 at 07:37
  • O(n Log[n]) solution is sort array then dedup in one pass. Not preserving order though. – Karatekid430 Aug 18 '22 at 05:51
  • Take note that using `JSON.stringify` will make the order of the properties important. So it's not good for general use. – dorsta Feb 15 '23 at 14:10
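As several of the comments above note, the filter/findIndex approach is O(n²). A linear-time sketch using a single pass and a Set of composite keys (assuming, as in the question, that place and name together define a duplicate):

```javascript
// De-duplicating in linear time: one pass, one Set of composite keys.
// Assumes (as in the question) that `place` and `name` together define a duplicate.
const dedupe = (items) => {
  const seen = new Set();
  return items.filter(({ place, name }) => {
    const key = `${place}\u0000${name}`; // separator avoids accidental key collisions
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
};

const result = dedupe([
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" },
]);
// result keeps the first occurrence of each place/name pair
```

This trades a little memory (the Set) for one pass over the array instead of one pass per element.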

One-liners with filter (preserves order)

Find unique ids in an array.

arr.filter((v,i,a)=>a.findIndex(v2=>(v2.id===v.id))===i)

If the order is not important, the Map-based solutions below will be faster.


Unique by multiple properties (place and name)

arr.filter((v,i,a)=>a.findIndex(v2=>['place','name'].every(k=>v2[k]===v[k]))===i)

Unique by all properties (This will be slow for large arrays)

arr.filter((v,i,a)=>a.findIndex(v2=>(JSON.stringify(v2) === JSON.stringify(v)))===i)

Keep the last occurrence by replacing findIndex with findLastIndex.

arr.filter((v,i,a)=>a.findLastIndex(v2=>(v2.place === v.place))===i)
chickens
  • v,i,a == value, index, array – James B Oct 09 '20 at 18:23
  • This worked great to find if the key,value pairs in my vue modal had duplicates. +1 – Arriel Feb 01 '21 at 22:23
  • arr.filter((v,i,a)=>a.findIndex(t=>(JSON.stringify(t) === JSON.stringify(v)))===i) this will not work if the keys are not in the same order – Jamal Hussain Mar 19 '21 at 14:11
  • what `t` on `findIndex` stands for? – Bernardo Marques Jun 10 '21 at 03:10
  • simply BEAUTIFUL – avalanche1 Nov 30 '21 at 15:15
  • This would be better if it had an explanation of what these did. And if they used legible naming conventions instead of trying to pre-minify the code. – Heretic Monkey Jan 26 '22 at 18:26
  • could this work if we had to search for a nested object property also? For example, for nested properties of name and place, inside a preferences attribute? {'id', 'preferences' : {place: ....., name: ...}} – nagiatzi May 24 '22 at 12:00
  • /* This is the function version */ export const dedup = (list, names) => list.filter((v,i,a)=>a.findIndex(v2=> names.every(k=>v2[k] ===v[k]))===i) – Houcheng Jun 09 '22 at 03:39
  • this is simple code and cleaner to understand `const newArray = arr.map(JSON.stringify).filter((el , i , arr)=> i === arr.indexOf(el)).map(JSON.parse)` without findIndex occurences – jsBug Jun 09 '22 at 12:42
  • Here is with an arbitrary predicate that decides duplicateness: `function uniqueByPredicate(arr, predicate) { return l.filter((v1, i, a) => a.findIndex(v2 => predicate(v1, v2)) === i); }` where `predicate` must be of type `(a: T, b: T) => boolean`. – zr0gravity7 Oct 29 '22 at 20:51

Using ES6+ in a single line you can get a unique list of objects by key:

const key = 'place';
const unique = [...new Map(arr.map(item => [item[key], item])).values()]

It can be put into a function:

function getUniqueListBy(arr, key) {
    return [...new Map(arr.map(item => [item[key], item])).values()]
}

Here is a working example:

const arr = [
    {place: "here",  name: "x", other: "other stuff1" },
    {place: "there", name: "x", other: "other stuff2" },
    {place: "here",  name: "y", other: "other stuff4" },
    {place: "here",  name: "z", other: "other stuff5" }
]

function getUniqueListBy(arr, key) {
    return [...new Map(arr.map(item => [item[key], item])).values()]
}

const arr1 = getUniqueListBy(arr, 'place')

console.log("Unique by place")
console.log(JSON.stringify(arr1))

console.log("\nUnique by name")
const arr2 = getUniqueListBy(arr, 'name')

console.log(JSON.stringify(arr2))

How does it work?

First, the array is remapped so that it can be used as the input for a Map:

arr.map(item => [item[key], item]);

Each item of the array is transformed into another array with two elements: the selected key as the first element and the entire initial item as the second. This is called an entry (as in array entries and map entries), and it is exactly the format the Map constructor accepts as input.

Example when key is place:

[["here", {place: "here",  name: "x", other: "other stuff1" }], ...]

Secondly, we pass this modified array to the Map constructor, and here the magic happens: Map eliminates duplicate keys, keeping only the last inserted value for each key. Note: Map preserves insertion order (check the difference between Map and object).

new Map(entry array just mapped above)

Third, we use the Map's values to retrieve the original items, but this time without duplicates.

new Map(mappedArr).values()

Finally, we spread those values into a fresh new array so that the result has the same structure as the input, and return it:

return [...new Map(mappedArr).values()]

CPHPython
V. Sambor
  • This does not answer the original question as this is searches for an `id`. The question needs the entire object to be unique across all fields such as `place` and `name` – L. Holanda Dec 10 '19 at 18:54
  • Your ES6 function seems very concise and practical. Can you explain it a bit more? What is happening exactly? Are first or last duplicates removed? Or is it random, which duplicate gets removed? That would be helpful, thanks. – David Schumann Mar 25 '20 at 17:15
  • As far as i can tell, a Map with the property value as key is created. But it is not 100% how or if the order of the array is preserved. – David Schumann Mar 25 '20 at 18:16
  • Hi @DavidSchumann, I will update the answer and will explain how it works. But for a short answer: the order is preserved and the first ones are removed... Just think about how it is inserted in the map... it checks if the key already exists and updates it, therefore the last one will remain – V. Sambor Mar 25 '20 at 18:21
  • There is an error in case of null items or some items without the called key, any fix for this? – Milad Abooali Jun 04 '21 at 09:17
  • @MiladAbooali can you give me an example which does not work. I will look into it. thanks – V. Sambor Jun 04 '21 at 13:17
  • I fixed that by adding a check for the length of each item. – Milad Abooali Jun 13 '21 at 13:32
  • Hi All, if i have 2 more records { place: "", name: "x", other: "other stuff1" }, { place: "", name: "x", other: "other stuff2" } , like this and I want empty based also then? – Shaik Habeeb Jun 17 '21 at 08:12
  • TS version, incase anyone is looking: ```export const unique = (arr: T[], key: string): T[] => [ ...new Map(arr.map((item: T) => [item[key], item])).values() ]; ``` – readikus Aug 16 '22 at 07:55

Simple and performant solution with a better runtime than the 70+ answers that already exist:

const ids = arr.map(({ id }) => id);
const filtered = arr.filter(({ id }, index) => !ids.includes(id, index + 1));

Example:

const arr = [{
  id: 1,
  name: 'one'
}, {
  id: 2,
  name: 'two'
}, {
  id: 1,
  name: 'one'
}];

const ids = arr.map(({ id }) => id);
const filtered = arr.filter(({ id }, index) => !ids.includes(id, index + 1));

console.log(filtered);

How it works:

Array.filter() removes all duplicate objects by checking if the previously mapped id-array includes the current id ({ id } destructures the object into only its id). To only filter out actual duplicates, it uses Array.includes()'s second parameter fromIndex with index + 1, which ignores the current object and all previous ones.

Since every iteration of the filter callback method will only search the array beginning at the current index + 1, this also dramatically reduces the runtime because only objects not previously filtered get checked.
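The fromIndex mechanics can be seen in isolation with plain numbers (a small illustration, not from the answer; note that, as a consequence, this approach keeps the last occurrence of each duplicate rather than the first):

```javascript
const ids = [1, 2, 1];

// For each element, look for the same id later in the array (fromIndex = index + 1).
// An element is dropped when a later copy exists, so the LAST occurrence survives.
const kept = ids.filter((id, index) => !ids.includes(id, index + 1));
// kept is [2, 1]
```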

What if you don't have a single unique identifier like id?

Just create a temporary one:

const objToId = ({ name, city, birthyear }) => `${name}-${city}-${birthyear}`;


const ids = arr.map(objToId);
const filtered = arr.filter((item, index) => !ids.includes(objToId(item), index + 1));
leonheess
  • Would it make sense to use a Set instead of an array? – user239558 Feb 20 '21 at 18:00
  • @user239558 Good question but not really, it would be orders of magnitude slower and for objects with a different order like `{id: 1, name: 'one'}` and `{namd: 'one', id: 1}` it would fail to detect the duplicate. – leonheess Feb 21 '21 at 23:52
  • what is this magic with { id } you're pulling here? I'm following everything else. Was about to implement a Set for my own purposes but found this – Timotronadon Mar 04 '21 at 17:34
  • Good question, @Timotronadon. `{ id }` is [destructuring](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Destructuring_assignment#object_destructuring) the object into only its `id`-key. To illustrate let's look at this these two loops: 1. `arr.forEach(object => console.log(object.id))` and 2. `arr.forEach({id} => console.log(id))`. They are both doing exactly the same thing: printing the `id`-key of all objects in `arr`. However, one is using destructuring and the other one is using a more conventional key access via the dot notation. – leonheess Mar 04 '21 at 23:54
  • You can even destructure an object into multiple keys like so: `{ id, name }`. – leonheess Mar 04 '21 at 23:57
  • def the best response here. Simple clean and elegant and works like a charm thank you! – d0rf47 Nov 19 '21 at 00:54
  • Amazing answer. This worked perfectly without using any external library. – SatelBill Mar 18 '22 at 03:37
  • @leonheess not a silver bullet; as with the previous example here, try to implement a recent-items list and it falls short... – Игор Ташевски Jun 28 '23 at 04:31
  • At first I liked that answer. But later I have to use another answer https://stackoverflow.com/a/56757215/1078641 because that one used the last object values for the key. – electroid Jun 29 '23 at 11:29
  • @ИгорТашевски I added an example how you can use this answer to support filtering with multiple keys if that's what you were trying to do. If that doesn't solve your issue just lmk – leonheess Jul 09 '23 at 16:13
  • @electroid see the above comment – leonheess Jul 09 '23 at 16:14

A primitive method would be:

const obj = {};

for (let i = 0, len = things.thing.length; i < len; i++) {
  obj[things.thing[i]['place']] = things.thing[i];
}

things.thing = new Array();

 for (const key in obj) { 
   things.thing.push(obj[key]);
}
Bricky
aefxx
  • You should never use the length in the for loop, because it will slow everything down calculating it on every iteration. Assign it to a variable outside the loop and pass the variable instead of things.thing.length. – Nosebleed Aug 26 '14 at 12:56
  • @aefxx I do not quite understand this function, how do you handle the situation that the "place" is same but name is different, should that be consider dup or not? – Kuan Jun 23 '15 at 21:48
  • Though this works, it does not take care of a sorted array since fetching keys is never order guaranteed. So, you end up sorting it again. Now, suppose the array was not sorted but yet its order is important, there is no way you can make sure that order stays intact – Deepak G M Apr 17 '19 at 06:31
  • @DeepakGM You're absolutely right. The answer won't (necessarily) preserve a given order. If that is a requirement, one should look for another solution. – aefxx Apr 17 '19 at 17:03
  • How could I modify the above to remove objects from an array that contain X as well as de-duped? – Ryan H Feb 09 '20 at 12:36
  • @RyanHolton Please ask a new question and reference this answer. It's cumbersome to try to help in comments only. – aefxx Feb 11 '20 at 14:15
  • You could also use `Object.values()` instead of the second loop – OzzyTheGiant Jul 23 '23 at 02:36
  • @OzzyTheGiant Feel free to update the answer! – aefxx Jul 23 '23 at 03:18
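As the last comment suggests, Object.values() can replace the second loop. A minimal sketch of that variant, keyed on place as in the answer (so the last object seen for each place wins):

```javascript
// Same lookup-object idea as above, with Object.values() replacing the second loop.
// Keyed on `place`, so the last object seen for each place wins (as in the original).
const things = {};
things.thing = [
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" },
];

const lookup = {};
for (const item of things.thing) {
  lookup[item.place] = item; // later entries overwrite earlier ones
}
things.thing = Object.values(lookup);
```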

If you can use Javascript libraries such as underscore or lodash, I recommend having a look at _.uniq function in their libraries. From lodash:

_.uniq(array, [isSorted=false], [callback=_.identity], [thisArg])

Basically, you pass in the array (here, an array of object literals) and the attribute by which you want to remove duplicates from the original data array, like this:

var data = [{'name': 'Amir', 'surname': 'Rahnama'}, {'name': 'Amir', 'surname': 'Stevens'}];
var non_duplidated_data = _.uniq(data, 'name'); 

Update: Lodash has since introduced .uniqBy as well.

vsync
ambodi
  • @Praveen Pds: Did I say anything about underscore in the code example? I said 'lodash' has this function and underscore has similar ones. Before voting down, please read answers carefully. – ambodi Jan 25 '15 at 11:08
  • //Lists unique objects using _underscore.js holdingObject = _.uniq(holdingObject , function(item, key, name) { return item.name; }); – praveenpds Jan 26 '15 at 08:31
  • Note: you now need to use `uniqBy` instead of `uniq`, e.g. `_.uniqBy(data, 'name')`... documentation: https://lodash.com/docs#uniqBy – drmrbrewer Jun 14 '17 at 07:46
  • If you have a deep collection: `let data = [{'v': {'t':1, 'name':"foo"}}, {'v': {'t':1, 'name':"bar"}}];` do: `let uniq = _.uniqBy(data, 'v.t');` – Stas Sorokin Jan 02 '22 at 09:27

I had this exact same requirement: to remove duplicate objects from an array, based on duplicates in a single field. I found the code here: Javascript: Remove Duplicates from Array of Objects

So in my example, I'm removing any object from the array that has a duplicate licenseNum string value.

var arrayWithDuplicates = [
    {"type":"LICENSE", "licenseNum": "12345", state:"NV"},
    {"type":"LICENSE", "licenseNum": "A7846", state:"CA"},
    {"type":"LICENSE", "licenseNum": "12345", state:"OR"},
    {"type":"LICENSE", "licenseNum": "10849", state:"CA"},
    {"type":"LICENSE", "licenseNum": "B7037", state:"WA"},
    {"type":"LICENSE", "licenseNum": "12345", state:"NM"}
];

function removeDuplicates(originalArray, prop) {
    var newArray = [];
    var lookupObject = {};

    for (var i in originalArray) {
        lookupObject[originalArray[i][prop]] = originalArray[i];
    }

    for (i in lookupObject) {
        newArray.push(lookupObject[i]);
    }
    return newArray;
}

var uniqueArray = removeDuplicates(arrayWithDuplicates, "licenseNum");
console.log("uniqueArray is: " + JSON.stringify(uniqueArray));

The results:

uniqueArray is:

[{"type":"LICENSE","licenseNum":"10849","state":"CA"},
{"type":"LICENSE","licenseNum":"12345","state":"NM"},
{"type":"LICENSE","licenseNum":"A7846","state":"CA"},
{"type":"LICENSE","licenseNum":"B7037","state":"WA"}]
Vini.g.fer
James Drinkard
  • This would be more useful if the function could filter the 'falsy' objects too. `for(var i in array) { if(array[i][prop]){ //valid lookupObject[array[i][prop]] = array[i]; } else { console.log('falsy object'); } }` – Abdul Sadik Yalcin Nov 06 '17 at 17:49
  • Why not bring down the complexity to O(n) by using: `for (let i in originalArray) { if (lookupObject[originalArray[i]['id']] === undefined) { newArray.push(originalArray[i]); } lookupObject[originalArray[i]['id']] = originalArray[i]; }` – Tudor B. Feb 17 '19 at 22:17
  • this is the best way because it is important to know what it is that you want to not be duplicated. Now can this be done through reducer for e6 standards? – Christian Matthew Sep 16 '19 at 19:09

One-liner using Set

var things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

// assign things.thing to myData for brevity
var myData = things.thing;

things.thing = Array.from(new Set(myData.map(JSON.stringify))).map(JSON.parse);

console.log(things.thing)

Explanation:

  1. new Set(myData.map(JSON.stringify)) creates a Set object using the stringified myData elements.
  2. Set object will ensure that every element is unique.
  3. Then I create an array based on the elements of the created set using Array.from.
  4. Finally, I use JSON.parse to convert the stringified elements back into objects.
Mμ.
  • the problem being {a: 1, b:2} wont be equal to {b:2,a:1} – PirateApp Oct 02 '17 at 10:00
  • keep in mind that there would be a problems with Date properties – MarkosyanArtur Mar 05 '18 at 06:31
  • This line creates random null values with a row object that do not exist in the original array of objects. Can you please help? – B1K Oct 16 '18 at 16:26
  • To address the issue @PirateApp pointed out in the comments the answer provided by @Mu can be modified as follows to handle objects with rearranged properties: `const distinct = (data, elements = []) => [...new Set(data.map(o => JSON.stringify(o, elements)))].map(o => JSON.parse(o));` Then when calling `distinct` just pass in the property names for the elements array. For the original post that would be `['place', 'name']`. For @PirateApp's example that would be `['a', 'b']`. – knot22 May 25 '22 at 13:31

Here is an ES6 one-liner:

let arr = [
  {id:1,name:"sravan ganji"},
  {id:2,name:"pinky"},
  {id:4,name:"mammu"},
  {id:3,name:"avy"},
  {id:3,name:"rashni"},
];

console.log(Object.values(arr.reduce((acc,cur)=>Object.assign(acc,{[cur.id]:cur}),{})))
sravan ganji
  • Nice and clean if you only want to remove objects with a single duplicate value, not so clean for fully duplicated objects. – David Barker May 29 '19 at 23:56
  • @DavidBarker you mean multiple duplicate values with an object ? – sravan ganji Sep 01 '20 at 20:08
  • yes, but more specifically objects that have all identical values. – David Barker Sep 02 '20 at 07:42
  • What is the functionality of `:cur` in `cur.id]:cur`? I dont understand this piece of the code. – Jonathan Arias Feb 17 '21 at 16:12
  • using lodash( _ ) we can do the same thing using `_.uniqBy(arr,'id')` – Akhil S Apr 05 '21 at 10:59
  • As is always the case, explanation of code is good. – Heretic Monkey Jan 26 '22 at 18:29
  • This considers objects only by `id`; but the object key `cur.id` is only one possible serialization of objects. An alternative could be `\`${cur.id}-${cur.name}\``, for example; this would consider _both_ properties (although better serializations would be needed, in general). Soon, [Records](//github.com/tc39/proposal-record-tuple) will make this easily usable with [Maps](//developer.mozilla.org/docs/Web/JavaScript/Reference/Global_Objects/Map). – Sebastian Simon Feb 21 '22 at 08:16
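Regarding the comment asking about `[cur.id]: cur`: the square brackets are a computed property name, so each object is stored under the value of its own id, and Object.assign lets a later object with the same id overwrite an earlier one. A small illustration (the `acc` and `cur` names here mirror the reducer above):

```javascript
// `{ [cur.id]: cur }` uses a computed property name: the object is stored
// under the value of its own `id`. With Object.assign, a later object with
// the same id overwrites an earlier one, which is why the last duplicate wins.
const cur = { id: 3, name: "avy" };
const entry = { [cur.id]: cur }; // equivalent to { 3: cur }

const acc = Object.assign({}, { 3: { id: 3, name: "old" } }, entry);
// acc[3] is now the newer object
```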

To remove all duplicates from an array of objects, the simplest way is to use filter:

var uniq = {};
var arr  = [{"id":"1"},{"id":"1"},{"id":"2"}];
var arrFiltered = arr.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
console.log('arrFiltered', arrFiltered);
аlex
  • It's good practice on Stack Overflow to add an explanation as to why your solution should work, especially how yours is better than the other answers. For more information read [How To Answer](//stackoverflow.com/help/how-to-answer). – Samuel Liew Sep 15 '18 at 05:14
  • This does not answer the original question as this is searches for an `id`. The question needs the entire object to be unique across all fields such as `place` and `name` – L. Holanda Dec 10 '19 at 18:47

One-liners with Map (high performance, does not preserve order)

Find unique ids in array arr.

const arrUniq = [...new Map(arr.map(v => [v.id, v])).values()]

If the order is important, check out the solution with filter above.


Unique by multiple properties ( place and name ) in array arr

const arrUniq = [...new Map(arr.map(v => [JSON.stringify([v.place,v.name]), v])).values()]

Unique by all properties in array arr

const arrUniq = [...new Map(arr.map(v => [JSON.stringify(v), v])).values()]

Keep the first occurrence in array arr

const arrUniq = [...new Map(arr.slice().reverse().map(v => [v.id, v])).values()].reverse()
chickens

Here's another option using Array iteration methods, if you need to compare by only one field of an object:

    function uniq(a, param){
        return a.filter(function(item, pos, array){
            return array.map(function(mapItem){ return mapItem[param]; }).indexOf(item[param]) === pos;
        })
    }

    uniq(things.thing, 'place');
Alex Kobylinski
  • Although this has an order greater than O(n²), this fits my use case because my array size will always be less than 30. Thanks! – Sterex Jul 13 '16 at 11:44

This is a generic way of doing this: you pass in a function that tests whether two elements of an array are considered equal. In this case, it compares the values of the name and place properties of the two objects being compared.

ES5 answer

function removeDuplicates(arr, equals) {
    var originalArr = arr.slice(0);
    var i, len, val;
    arr.length = 0;

    for (i = 0, len = originalArr.length; i < len; ++i) {
        val = originalArr[i];
        if (!arr.some(function(item) { return equals(item, val); })) {
            arr.push(val);
        }
    }
}

function thingsEqual(thing1, thing2) {
    return thing1.place === thing2.place
        && thing1.name === thing2.name;
}

var things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];

removeDuplicates(things, thingsEqual);
console.log(things);

Original ES3 answer

function arrayContains(arr, val, equals) {
    var i = arr.length;
    while (i--) {
        if ( equals(arr[i], val) ) {
            return true;
        }
    }
    return false;
}

function removeDuplicates(arr, equals) {
    var originalArr = arr.slice(0);
    var i, len, j, val;
    arr.length = 0;

    for (i = 0, len = originalArr.length; i < len; ++i) {
        val = originalArr[i];
        if (!arrayContains(arr, val, equals)) {
            arr.push(val);
        }
    }
}

function thingsEqual(thing1, thing2) {
    return thing1.place === thing2.place
        && thing1.name === thing2.name;
}

removeDuplicates(things.thing, thingsEqual);
DevDavid
Tim Down
  • Two objects won't evaluate equal, even if they share the same properties and values. – kennebec Feb 08 '10 at 04:06
  • Yes, I know. But fair point, I've failed to read the question correctly: I hadn't spotted that it was objects with identical properties he needed to weed out. I'll edit my answer. – Tim Down Feb 08 '10 at 09:14
  • Instead of the while loop inside arrayContains, use the Array.prototype.some method, which returns true if any array member matches the condition – MarkosyanArtur Mar 05 '18 at 06:29

If you can wait to eliminate the duplicates until after all the additions, the typical approach is to first sort the array and then eliminate duplicates. The sorting avoids the N * N approach of scanning the array for each element as you walk through them.

The "eliminate duplicates" function is usually called unique or uniq. Some existing implementations may combine the two steps, e.g., prototype's uniq

This post has a few ideas to try (and some to avoid :-) ) if your library doesn't already have one! Personally I find this one the most straightforward:

    function unique(a){
        a.sort();
        for(var i = 1; i < a.length; ){
            if(a[i-1] == a[i]){
                a.splice(i, 1);
            } else {
                i++;
            }
        }
        return a;
    }  

    // Provide your own comparison
    function unique(a, compareFunc){
        a.sort( compareFunc );
        for(var i = 1; i < a.length; ){
            if( compareFunc(a[i-1], a[i]) === 0){
                a.splice(i, 1);
            } else {
                i++;
            }
        }
        return a;
    }
h2ooooooo
maccullt
  • That won't work for generic objects without a natural sort order. – Tim Down Feb 08 '10 at 09:28
  • True, I added a user-supplied comparison version. – maccullt Feb 08 '10 at 10:57
  • Your user-supplied comparison version won't work because if your comparison function is `function(_a,_b){return _a.a===_b.a && _a.b===_b.b;}` then the array won't be sorted. – graham.reeds Mar 25 '10 at 06:02
  • That is an invalid compare function. From https://developer.mozilla.org/en/Core_JavaScript_1.5_Reference/Global_Objects/Array/sort ... function compare(a, b) { if (a is less than b by some ordering criterion) return -1; if (a is greater than b by the ordering criterion) return 1; // a must be equal to b return 0; } ... – maccullt Mar 25 '10 at 17:01
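As the last comment points out, a compare function passed to sort must return a negative number, zero, or a positive number. A sketch of a valid comparator for the objects in the question (assuming place and name together define equality, and using localeCompare for the three-way result):

```javascript
// A valid three-way comparator for the objects in the question: it returns a
// negative number, zero, or a positive number, as Array.prototype.sort requires.
function compareThings(a, b) {
  return a.place.localeCompare(b.place) || a.name.localeCompare(b.name);
}

const sorted = [
  { place: "there", name: "morestuff" },
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
].sort(compareThings);
// Equal objects end up adjacent, so the unique() pass above can splice them out
```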

I think the best approach is using reduce with a Map object. This is a single-line solution.

const data = [
  {id: 1, name: 'David'},
  {id: 2, name: 'Mark'},
  {id: 2, name: 'Lora'},
  {id: 4, name: 'Tyler'},
  {id: 4, name: 'Donald'},
  {id: 5, name: 'Adrian'},
  {id: 6, name: 'Michael'}
]

const uniqueData = [...data.reduce((map, obj) => map.set(obj.id, obj), new Map()).values()];

console.log(uniqueData)

/*
  in `map.set(obj.id, obj)`
  
  'obj.id' is key. (don't worry. we'll get only values using the .values() method)
  'obj' is whole object.
*/
doğukan

Considering lodash.uniqWith

const objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];
 
_.uniqWith(objects, _.isEqual);
// => [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }]
leonheess
  • 16,068
  • 14
  • 77
  • 112
Justin
  • 4,400
  • 2
  • 32
  • 36
  • 1
    Neither lodash's uniq nor uniqBy did the trick, but your solution did. Thanks! Please give the source of your code however, if it's a direct copy. https://lodash.com/docs/4.17.10#uniqWith – Manu CJ Jun 18 '18 at 08:21
  • works perfect for me this solution – Mamé Jan 24 '22 at 14:58
16

To add one more to the list. Using ES6 and Array.reduce with Array.find.
In this example filtering objects based on a guid property.

let filtered = array.reduce((accumulator, current) => {
  if (! accumulator.find(({guid}) => guid === current.guid)) {
    accumulator.push(current);
  }
  return accumulator;
}, []);

Extending this one to allow selection of a property and compress it into a one liner:

const uniqify = (array, key) => array.reduce((prev, curr) => prev.find(a => a[key] === curr[key]) ? prev : prev.push(curr) && prev, []);

To use it pass an array of objects and the name of the key you wish to de-dupe on as a string value:

const result = uniqify(myArrayOfObjects, 'guid')
Stevo
  • 2,601
  • 3
  • 24
  • 32
Pete B
  • 1,709
  • 18
  • 11
15

You could also use a Map:

const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());

Full sample:

const things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

const dedupThings = Array.from(things.thing.reduce((m, t) => m.set(t.place, t), new Map()).values());

console.log(JSON.stringify(dedupThings, null, 4));

Result:

[
    {
        "place": "here",
        "name": "stuff"
    },
    {
        "place": "there",
        "name": "morestuff"
    }
]
Pragmateek
  • 13,174
  • 9
  • 74
  • 108
15

Dang, kids, let's crush this thing down, why don't we?

let uniqIds = {}, source = [{id:'a'},{id:'b'},{id:'c'},{id:'b'},{id:'a'},{id:'d'}];
let filtered = source.filter(obj => !uniqIds[obj.id] && (uniqIds[obj.id] = true));
console.log(filtered);
// EXPECTED: [{id:'a'},{id:'b'},{id:'c'},{id:'d'}];
Cliff Hall
  • 723
  • 6
  • 13
  • 1
    This does not answer the original question as this is searches for an `id`. The question needs the entire object to be unique across all fields such as `place` and `name` – L. Holanda Dec 10 '19 at 18:55
  • 2
    This is a refinement of an above generalization of the problem. The original question was posted 9 years ago, so the original poster probably isn't worried about `place` and `name` today. Anyone reading this thread is looking for an optimal way to dedup a list of objects, and this is a compact way of doing so. – Cliff Hall Dec 11 '19 at 19:25
15

let myData = [{place:"here",name:"stuff"}, 
 {place:"there",name:"morestuff"},
 {place:"there",name:"morestuff"}];


let q = [...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];

console.log(q)

One-liner using ES6 and new Map().

// assign things.thing to myData
let myData = things.thing;

[...new Map(myData.map(obj => [JSON.stringify(obj), obj])).values()];

Details:-

  1. Doing .map() on the data list converts each individual object into a [key, value] pair array (length = 2); the first element (key) is the stringified version of the object and the second (value) is the object itself.
  2. Adding the above array list to new Map() makes the key the stringified object, and any addition with the same key overrides the already existing entry.
  3. Using .values() gives a MapIterator with all values in the Map (the objects in our case).
  4. Finally, the spread ... operator gives a new Array with the values from the above step.
Savan Akbari
  • 460
  • 1
  • 6
  • 13
14

A TypeScript solution

This will remove duplicate objects and also preserve the types of the objects.

function removeDuplicateObjects<T>(array: T[]): T[] {
  // Note: the JSON round-trip drops functions and converts values like Date to strings
  return [...new Set(array.map(s => JSON.stringify(s)))]
    .map(s => JSON.parse(s) as T);
}
12

 const things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];
const filteredArr = things.reduce((thing, current) => {
  const x = thing.find(item => item.place === current.place);
  if (!x) {
    return thing.concat([current]);
  } else {
    return thing;
  }
}, []);
console.log(filteredArr)

Solution Via Set Object | According to the data type

const seen = new Set();
 const things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];

const filteredArr = things.filter(el => {
  const duplicate = seen.has(el.place);
  seen.add(el.place);
  return !duplicate;
});
console.log(filteredArr)

Set Object Features

Each value in a Set has to be unique; value equality is checked when values are added.

The purpose of the Set object is to store unique values of any data type, whether primitive values or object references. It has four very useful instance methods: add, clear, has and delete.

add method

Pushes a value into the collection only if it is not already present, and it distinguishes values by data type, so for example the number 1 and the string '1' are treated as different values.

has method

Sometimes you need to check whether an item already exists in the collection; has is the handy method for checking a unique id or item, respecting the data type.

delete method

Removes a specific item from the collection, again matching on both value and data type.

clear method

Removes all items from the collection, leaving it empty.

The Set object also has iteration methods and more features.

Better read from here: Set - JavaScript | MDN
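The four methods above can be sketched in a few lines (the values here are purely illustrative):

```javascript
// A Set keeps only unique values; equality respects the data type.
const ids = new Set();

ids.add(1);
ids.add('1');   // kept: the string '1' is a different type than the number 1
ids.add(1);     // ignored: the number 1 is already in the collection

console.log(ids.has(1));    // true
console.log(ids.size);      // 2

ids.delete(1);              // removes the number 1 only
console.log(ids.has(1));    // false

ids.clear();                // empties the collection
console.log(ids.size);      // 0
```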

نور
  • 1,425
  • 2
  • 22
  • 38
11

removeDuplicates() takes in an array of objects and returns a new array without any duplicate objects (based on the id property).

const allTests = [
  {name: 'Test1', id: '1'}, 
  {name: 'Test3', id: '3'},
  {name: 'Test2', id: '2'},
  {name: 'Test2', id: '2'},
  {name: 'Test3', id: '3'}
];

function removeDuplicates(array) {
  let uniq = {};
  return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true))
}

removeDuplicates(allTests);

Expected outcome:

[
  {name: 'Test1', id: '1'}, 
  {name: 'Test3', id: '3'},
  {name: 'Test2', id: '2'}
];

First, we set the value of variable uniq to an empty object.

Next, we filter through the array of objects. Filter creates a new array with all elements that pass the test implemented by the provided function.

return array.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));

Above, we use the short-circuiting functionality of &&. If the left side of the && evaluates to true, then it returns the value on the right of the &&. If the left side is false, it returns what is on the left side of the &&.

For each object(obj) we check uniq for a property named the value of obj.id (In this case, on the first iteration it would check for the property '1'.) We want the opposite of what it returns (either true or false) which is why we use the ! in !uniq[obj.id]. If uniq has the id property already, it returns true which evaluates to false (!) telling the filter function NOT to add that obj. However, if it does not find the obj.id property, it returns false which then evaluates to true (!) and returns everything to the right of the &&, or (uniq[obj.id] = true). This is a truthy value, telling the filter method to add that obj to the returned array, and it also adds the property {1: true} to uniq. This ensures that any other obj instance with that same id will not be added again.
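The short-circuit behaviour described above can be seen in isolation (a minimal sketch using a bare lookup object, outside of the full filter):

```javascript
const seen = {};

// For a new id the left side is true (!undefined), so the right side runs:
// the assignment marks the id as seen and the expression evaluates to true.
console.log(!seen['1'] && (seen['1'] = true)); // true  (first time)

// For a repeated id, !seen['1'] is false, so && short-circuits to false
// and the item would be filtered out.
console.log(!seen['1'] && (seen['1'] = true)); // false (duplicate)
```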

leonheess
  • 16,068
  • 14
  • 77
  • 112
MarkN
  • 139
  • 1
  • 4
11

A concise, type-safe answer for lazy TypeScript developers (note: this keeps the last occurrence of each key, and the includes lookup makes it O(n²)):

export const uniqueBy = <T>( uniqueKey: keyof T, objects: T[]): T[] => {
  const ids = objects.map(object => object[uniqueKey]);
  return objects.filter((object, index) => !ids.includes(object[uniqueKey], index + 1));
} 
Masih Jahangiri
  • 9,489
  • 3
  • 45
  • 51
9

This way works well for me:

function arrayUnique(arr, uniqueKey) {
  const flagList = new Set()
  return arr.filter(function(item) {
    if (!flagList.has(item[uniqueKey])) {
      flagList.add(item[uniqueKey])
      return true
    }
  })
}
const data = [
  {
    name: 'Kyle',
    occupation: 'Fashion Designer'
  },
  {
    name: 'Kyle',
    occupation: 'Fashion Designer'
  },
  {
    name: 'Emily',
    occupation: 'Web Designer'
  },
  {
    name: 'Melissa',
    occupation: 'Fashion Designer'
  },
  {
    name: 'Tom',
    occupation: 'Web Developer'
  },
  {
    name: 'Tom',
    occupation: 'Web Developer'
  }
]
console.table(arrayUnique(data, 'name'))// work well

printout

┌─────────┬───────────┬────────────────────┐
│ (index) │   name    │     occupation     │
├─────────┼───────────┼────────────────────┤
│    0    │  'Kyle'   │ 'Fashion Designer' │
│    1    │  'Emily'  │   'Web Designer'   │
│    2    │ 'Melissa' │ 'Fashion Designer' │
│    3    │   'Tom'   │  'Web Developer'   │
└─────────┴───────────┴────────────────────┘

ES5:

function arrayUnique(arr, uniqueKey) {
  const flagList = []
  return arr.filter(function(item) {
    if (flagList.indexOf(item[uniqueKey]) === -1) {
      flagList.push(item[uniqueKey])
      return true
    }
  })
}

These two ways are simpler and more understandable.

leonheess
  • 16,068
  • 14
  • 77
  • 112
JackChouMine
  • 947
  • 8
  • 22
8

Here is a solution for ES6 where you only want to keep the last item. This solution is functional and Airbnb style compliant.

const things = {
  thing: [
    { place: 'here', name: 'stuff' },
    { place: 'there', name: 'morestuff1' },
    { place: 'there', name: 'morestuff2' }, 
  ],
};

const removeDuplicates = (array, key) => {
  return array.reduce((arr, item) => {
    const removed = arr.filter(i => i[key] !== item[key]);
    return [...removed, item];
  }, []);
};

console.log(removeDuplicates(things.thing, 'place'));
// > [{ place: 'here', name: 'stuff' }, { place: 'there', name: 'morestuff2' }]
leonheess
  • 16,068
  • 14
  • 77
  • 112
Micah
  • 97
  • 1
  • 3
  • You can remove the duplicate and you can also remove all the duplicate with this code. Nice – sg28 May 23 '19 at 18:02
7

If the array contains objects, you can use this to remove duplicates:

const persons= [
      { id: 1, name: 'John',phone:'23' },
      { id: 2, name: 'Jane',phone:'23'},
      { id: 1, name: 'Johnny',phone:'56' },
      { id: 4, name: 'Alice',phone:'67' },
    ];
const unique = [...new Map(persons.map((m) => [m.id, m])).values()];

If you want to remove duplicates based on phone instead, just replace m.id with m.phone:

const unique = [...new Map(persons.map((m) => [m.phone, m])).values()];
Ghias Ali
  • 257
  • 2
  • 10
6

I know there is a ton of answers in this question already, but bear with me...

Some of the objects in your array may have additional properties that you are not interested in, or you simply want to find the unique objects considering only a subset of the properties.

Consider the array below. Say you want to find the unique objects in this array considering only propOne and propTwo, and ignore any other properties that may be there.

The expected result should include only the first and last objects. So here goes the code:

const array = [{
    propOne: 'a',
    propTwo: 'b',
    propThree: 'I have no part in this...'
},
{
    propOne: 'a',
    propTwo: 'b',
    someOtherProperty: 'no one cares about this...'
},
{
    propOne: 'x',
    propTwo: 'y',
    yetAnotherJunk: 'I am valueless really',
    noOneHasThis: 'I have something no one has'
}];

const uniques = [...new Set(
    array.map(x => JSON.stringify(((o) => ({
        propOne: o.propOne,
        propTwo: o.propTwo
    }))(x))))
].map(JSON.parse);

console.log(uniques);
Sнаđошƒаӽ
  • 16,753
  • 12
  • 73
  • 90
  • It works but the other properties will be cleared, is it possible to keep the rest properties of the selected object? – Thanwa Ch. Sep 19 '20 at 11:43
  • @ThanwaCh. That's doable, and it is a matter of preference really - just need to determine which object the rest of the properties should be taken from in case of duplicates. Using my example, first and second objects in the `array` become one in the `uniques`. Now should that object contain `propThree` from `array[0]`, or `someOtherProperty` from `array[1]`, or both, or something else? As long as we know exactly what to do in such case, what you asked for is doable for sure. – Sнаđошƒаӽ Sep 19 '20 at 12:05
  • This solution worked beautifully for the use case I was coding. Can you explain what this part is/does `(({ propOne, propTwo }) => ({ propOne, propTwo }))(x)`? – knot22 May 26 '21 at 14:12
  • 1
    @knot22 the part before `(x)` is an arrow function which is unpacking the argument object into properties `propOne` and `propTwo`. Learn about object destructuring [here](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Destructuring_assignment#object_destructuring). Now that I have read the code again, I think it should have been written a little more clearly. I have updated the code. – Sнаđошƒаӽ May 26 '21 at 15:12
5

Another option would be to create a custom indexOf function, which compares the values of your chosen property for each object and wrap this in a reduce function.

var uniq = redundant_array.reduce(function(a,b){
      function indexOfProperty (a, b){
          for (var i=0;i<a.length;i++){
              if(a[i].property == b.property){
                   return i;
               }
          }
         return -1;
      }

      if (indexOfProperty(a,b) < 0 ) a.push(b);
        return a;
    },[]);
ZeroSum
  • 165
  • 2
  • 4
  • this worked out great for me - I paired this with [`lodash.isequal` npm package](https://www.npmjs.com/package/lodash.isequal) as a lightweight object comparator to perform unique array filtering ...e.g. distinct array of objects. Just swapped in `if (_.isEqual(a[i], b)) {` instead of looking @ a single property – SliverNinja - MSFT Nov 29 '17 at 17:40
3

This solution worked best for me, utilising the Array.from method. It's also shorter and more readable.

let person = [
{name: "john"}, 
{name: "jane"}, 
{name: "imelda"}, 
{name: "john"},
{name: "jane"}
];

const data = Array.from(new Set(person.map(JSON.stringify))).map(JSON.parse);
console.log(data);
3

Here is a simple solution for removing duplicates from an array of objects using the reduce method. I am filtering elements based on the position key of the objects.

const med = [
  {name: 'name1', position: 'left'},
  {name: 'name2', position: 'right'},
  {name: 'name3', position: 'left'},
  {name: 'name4', position: 'right'},
  {name: 'name5', position: 'left'},
  {name: 'name6', position: 'left1'}
]

const arr = [];
med.reduce((acc, curr) => {
  if(acc.indexOf(curr.position) === -1) {
    acc.push(curr.position);
    arr.push(curr);
  }
  return acc;
}, [])

console.log(arr)
Afeesudheen
  • 936
  • 2
  • 12
  • 21
2

Continuing to explore ES6 ways of removing duplicates from an array of objects: setting the thisArg argument of Array.prototype.filter to a new Set provides a decent alternative:

const things = [
  {place:"here",name:"stuff"},
  {place:"there",name:"morestuff"},
  {place:"there",name:"morestuff"}
];

const filtered = things.filter(function({place, name}) {

  const key =`${place}${name}`;

  return !this.has(key) && this.add(key);

}, new Set);

console.log(filtered);

However, it will not work with arrow functions () =>, as this is bound to their lexical scope.
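If you do want arrow functions, a close variant keeps the Set in an outer variable instead of this (a sketch, not part of the original answer):

```javascript
const things = [
  { place: 'here',  name: 'stuff' },
  { place: 'there', name: 'morestuff' },
  { place: 'there', name: 'morestuff' }
];

// The Set lives in the enclosing scope, so no thisArg is needed.
const seen = new Set();
const filtered = things.filter(({ place, name }) => {
  const key = `${place}${name}`;
  return !seen.has(key) && seen.add(key); // Set#add returns the Set, which is truthy
});

console.log(filtered.length); // 2
```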

Leonid Pyrlia
  • 1,594
  • 2
  • 11
  • 14
2

es6 magic in one line... readable at that!

// returns the elements of `a` whose 'prop' value does not appear in `b`
const removeDuplicatesWith = (a, b, prop) =>
  a.filter(x => !b.find(y => x[prop] === y[prop]));
doğukan
  • 23,073
  • 13
  • 57
  • 69
Josiah Coad
  • 325
  • 4
  • 11
  • this doesn't work in two ways. Not only did the two people that edited the original solution fundamentally change it by adding braces but not adding a return statement before the `a.filter`, the original function didnt work anyway since it ignores any items in the second array that are not in the first. – MarkC Feb 24 '21 at 22:29
2

Source

JSFiddle

This will remove duplicate objects without passing any key.

uniqueArray = a => [...new Set(a.map(o => JSON.stringify(o)))].map(s => JSON.parse(s));

var objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];

var unique = uniqueArray(objects);
console.log('Original Object',objects);
console.log('Unique',unique);
Suhail Akhtar
  • 1,718
  • 15
  • 29
  • Could you explain what's happening and how it's working? – MillerC Mar 03 '20 at 19:28
  • 1
    I played with it and from what I can see, the [...new Set(a.map(o => JSON.stringify(o))) returns the dup free values because a new Set removes duplicates. The set has to have the (o) stringified so it can correctly remove duplicate "strings", and then the following .map is converting the stringified values back into objects. – MillerC Mar 03 '20 at 19:51
2

If you strictly want to remove duplicates based on one property, you can reduce the array into an object keyed on the place property. Since an object can only have unique keys, you can then take the values to get back an array:

const unique = Object.values(things.thing.reduce((o, t) => ({ ...o, [t.place]: t }), {}))
crmackey
  • 349
  • 1
  • 5
  • 20
2

I believe a combination of reduce with JSON.stringify to compare objects exactly, selectively adding those that are not already in the accumulator, is an elegant way.

Keep in mind that JSON.stringify might become a performance issue in extreme cases where the array has many Objects and they are complex, BUT for majority of the time, this is the shortest way to go IMHO.

var collection= [{a:1},{a:2},{a:1},{a:3}]

var filtered = collection.reduce((filtered, item) => {
  if( !filtered.some(filteredItem => JSON.stringify(filteredItem) == JSON.stringify(item)) )
    filtered.push(item)
  return filtered
}, [])

console.log(filtered)

Another way of writing the same (but less efficient):

collection.reduce((filtered, item) => 
  filtered.some(filteredItem => 
    JSON.stringify(filteredItem ) == JSON.stringify(item)) 
      ? filtered
      : [...filtered, item]
, [])
vsync
  • 118,978
  • 58
  • 307
  • 400
2
 npm i lodash

 let non_duplicated_data = _.uniqBy(pendingDeposits, v => [v.stellarAccount, v.externalTransactionId].join());
azrahel
  • 1,143
  • 2
  • 13
  • 31
Muhammad Waqas
  • 511
  • 6
  • 9
2

The problem can be simplified to removing duplicates from the thing array.

You can implement a faster O(n) solution (assuming native key lookup is negligible) by using an object to both maintain unique criteria as keys and storing associated values.

Basically, the idea is to store all objects by their unique key, so that duplicates overwrite themselves:

const thing = [{ place: "here", name:"stuff" }, { place: "there", name:"morestuff" }, { place: "there", name:"morestuff" } ]

const uniques = {}
for (const t of thing) {
  const key = t.place + '$' + t.name  // Or whatever string criteria you want, which can be generified as Object.keys(t).join("$")
  uniques[key] = t                    // Last duplicate wins
}
const uniqueThing = Object.values(uniques)
console.log(uniqueThing)
Jérôme Beau
  • 10,608
  • 5
  • 48
  • 52
2
const objectsMap = new Map();
const placesName = [
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" },
];
placesName.forEach((object) => {
  objectsMap.set(object.place, object);
});
console.log(objectsMap);
ENcy
  • 324
  • 4
  • 7
1
let data = [
  {
    'name': 'Amir',
    'surname': 'Rahnama'
  }, 
  {
    'name': 'Amir',
    'surname': 'Stevens'
  }
];
let non_duplicated_data = _.uniqBy(data, 'name');
nwxdev
  • 4,194
  • 3
  • 16
  • 22
qzttt
  • 93
  • 1
  • 4
  • 14
    Please add an explanation around your code so future visitors can understand what it is you are doing. Thanks. – Bugs Jun 20 '17 at 08:06
  • Sorry...what is this code doing? how did you get `_.uniqBy` ? is this javascript ? – jovialcore May 22 '21 at 15:54
1

If you don't mind your unique array being sorted afterwards, this would be an efficient solution:

things.thing
  .sort((a, b) => a.place.localeCompare(b.place))
  .filter((current, index, array) =>
    index === 0 || current.place !== array[index - 1].place)

This way, you only have to compare the current element with the previous element in the array. Sorting once before filtering (O(n*log(n))) is cheaper than searching for a duplicate in the entire array for every array element (O(n²)).
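The same sort-then-compare-neighbours idea can be generalised with a key extractor so the sort and the filter stay in sync (a sketch; uniqueBySorted is my name, and it assumes the key is a string):

```javascript
// Sort a copy by the extracted key (O(n log n)), then keep only elements
// whose key differs from the previous element's key (O(n)).
const uniqueBySorted = (arr, keyFn) =>
  [...arr]
    .sort((a, b) => keyFn(a).localeCompare(keyFn(b)))
    .filter((current, index, sorted) =>
      index === 0 || keyFn(current) !== keyFn(sorted[index - 1]));

const result = uniqueBySorted(
  [{ place: 'there' }, { place: 'here' }, { place: 'there' }],
  t => t.place
);
console.log(result); // [{ place: 'here' }, { place: 'there' }]
```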

Clemens Helm
  • 3,841
  • 1
  • 22
  • 13
  • [Sorting in JavaScript: Shouldn't returning a boolean be enough for a comparison function?](https://stackoverflow.com/q/24080785) – adiga Oct 17 '20 at 09:04
1
str =[
{"item_id":1},
{"item_id":2},
{"item_id":2}
]

obj =[]
for (x in str){
    if(check(str[x].item_id)){
        obj.push(str[x])
    }   
}
function check(id){
    for (y in obj){
        if(obj[y].item_id === id){
            return false
        }
    }
    return true
}
console.log(obj)

str is an array of objects, some of which share the same value (in this small example, two objects have item_id 2). check(id) is a function that checks whether an object with the same item_id already exists in obj: it returns false if one exists and true otherwise. Based on that result, the object is pushed into the new array obj. The output of the above code is [{"item_id":1},{"item_id":2}]

Bibin Jaimon
  • 534
  • 1
  • 6
  • 15
1

Have you heard of the Lodash library? I recommend this utility when you don't want to write the logic yourself and would rather use existing code that is optimised and reliable.

Consider making an array like this

things.thing.push({place:"utopia",name:"unicorn"});
things.thing.push({place:"jade_palace",name:"po"});
things.thing.push({place:"jade_palace",name:"tigress"});
things.thing.push({place:"utopia",name:"flying_reindeer"});
things.thing.push({place:"panda_village",name:"po"});

Note that if you want to keep one attribute unique, you may very well do that by using lodash library. Here, you may use _.uniqBy

_.uniqBy(array, [iteratee=_.identity])

This method is like _.uniq (which returns a duplicate-free version of an array, in which only the first occurrence of each element is kept) except that it accepts iteratee which is invoked for each element in array to generate the criterion by which uniqueness is computed.

So, for example, if you want to return an array having unique attribute of 'place'

_.uniqBy(things.thing, 'place')

Similarly, if you want unique attribute as 'name'

_.uniqBy(things.thing, 'name')

Hope this helps.

Cheers!

Mayank Gangwal
  • 519
  • 3
  • 8
1
  • This solution is generic for any kind of object and checks every (key, value) pair of the objects in the array.
  • Using a temporary object as a hash table to see if the entire object was ever present as a key.
  • If the string representation of the object has already been seen, that item is skipped instead of being added to the result.

var arrOfDup = [{'id':123, 'name':'name', 'desc':'some desc'},
                {'id':125, 'name':'another name', 'desc':'another desc'},
                {'id':123, 'name':'name', 'desc':'some desc'},
                {'id':125, 'name':'another name', 'desc':'another desc'},
                {'id':125, 'name':'another name', 'desc':'another desc'}];

function removeDupes(dupeArray){
  let temp = {};
  let result = [];
  dupeArray.forEach((item) => {
    const key = JSON.stringify(item);
    if(!temp[key]){
      temp[key] = true;   // mark this exact object shape as seen
      result.push(item);
    }
  });
  return result;
}

arrOfDup = removeDupes(arrOfDup);

arrOfDup.forEach((item, pos) => {
  console.log(`item in array at position ${pos} is ${JSON.stringify(item)}`);
});
Fullstack Guy
  • 16,368
  • 3
  • 29
  • 44
1
const uniqueElements = (arr, fn) => arr.reduce((acc, v) => {
    if (!acc.some(x => fn(v, x))) { acc.push(v); }
    return acc;
}, []);

const stuff = [
    {place:"here",name:"stuff"},
    {place:"there",name:"morestuff"},
    {place:"there",name:"morestuff"},
];

const unique = uniqueElements(stuff, (a,b) => a.place === b.place && a.name === b.name );
//console.log( unique );

[{
    "place": "here",
    "name": "stuff"
  },
  {
    "place": "there",
    "name": "morestuff"
}]
wLc
  • 968
  • 12
  • 15
1

You can convert the array objects into strings so they can be compared, add the strings to a Set so the comparable duplicates will be automatically removed and then convert each of the strings back into objects.

It might not be as performant as other answers, but it's readable.

const things = {};

things.thing = [];
things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

const uniqueArray = (arr) => {

  const stringifiedArray = arr.map((item) => JSON.stringify(item));
  const set = new Set(stringifiedArray);

  return Array.from(set).map((item) => JSON.parse(item));
}

const uniqueThings = uniqueArray(things.thing);

console.log(uniqueThings);
Chunky Chunk
  • 16,553
  • 15
  • 84
  • 162
1

Simple solution with ES6 'reduce' and 'find' array helper methods

Works efficiently and perfectly fine!

"use strict";

var things = new Object();
things.thing = new Array();
things.thing.push({
    place: "here",
    name: "stuff"
});
things.thing.push({
    place: "there",
    name: "morestuff"
});
things.thing.push({
    place: "there",
    name: "morestuff"
});

// the logic is here

function removeDup(something) {
    return something.thing.reduce(function (prev, ele) {
        var found = prev.find(function (fele) {
            return ele.place === fele.place && ele.name === fele.name;
        });
        if (!found) {
            prev.push(ele);
        }
        return prev;
    }, []);
}
console.log(removeDup(things));
KJ Sudarshan
  • 2,694
  • 1
  • 29
  • 22
1

For those searching for a readable and simple solution, here is my version (note that when there are duplicates, the last occurrence wins):

    function removeDupplicationsFromArrayByProp(originalArray, prop) {
        let results = {};
        for(let i=0; i<originalArray.length;i++){
            results[originalArray[i][prop]] = originalArray[i];
        }
        return Object.values(results);
    }
TBE
  • 1,002
  • 1
  • 11
  • 32
1

My two cents here. If you know the properties are in the same order, you can stringify the elements and remove dupes from the array and parse the array again. Something like this:

var things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
  
let stringified = things.thing.map(i=>JSON.stringify(i));
let unique =  stringified.filter((k, idx)=> stringified.indexOf(k) === idx)
                         .map(j=> JSON.parse(j))
console.log(unique);
ABGR
  • 4,631
  • 4
  • 27
  • 49
1
  • Removing duplicates from an array of objects in React JS (working perfectly)

    let optionList = [];

    // Build [name, item] pairs; the Map keeps only the last item per name
    const dataArr = this.state.itemArray.map(item => [item.name, item]);
    const maparr = new Map(dataArr);
    const results = [...maparr.values()];

    results.forEach(data => {
      if (data.lead_owner !== null) {
        optionList.push({ label: data.name, value: data.name });
      }
    });
    console.log(optionList)
    
Kishor Ahir
  • 146
  • 7
1

You could use a Set along with the filter method to accomplish this:

var arrObj = [{
  a: 1,
  b: 2
}, {
  a: 1,
  b: 1
}, {
  a: 1,
  b: 2
}];

var duplicateRemover = new Set();

var distinctArrObj = arrObj.filter((obj) => {
  if (duplicateRemover.has(JSON.stringify(obj))) return false;
  duplicateRemover.add(JSON.stringify(obj));
  return true;
});

console.log(distinctArrObj);

A Set is a collection of unique values, but it compares objects by reference, so structurally equal objects would not be deduplicated directly. JSON.stringify converts each object into a primitive type, i.e. a String, which is compared by value, so we can filter on it.

If you want to remove duplicates based on only some particular property, e.g. key, you could replace JSON.stringify(obj) with obj.key
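For instance, deduplicating on a single key property only might look like this (a sketch based on the approach above; the data is made up):

```javascript
const people = [
  { key: 'a', name: 'Ann' },
  { key: 'b', name: 'Bob' },
  { key: 'a', name: 'Another Ann' } // duplicate key, different name
];

// Track keys we have already seen; the first object per key is kept.
const seenKeys = new Set();
const distinct = people.filter(person => {
  if (seenKeys.has(person.key)) return false;
  seenKeys.add(person.key);
  return true;
});

console.log(distinct.length); // 2
```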

1

This is a single-loop approach with a Set and some closures, avoiding variables declared outside of the function declarations and keeping the code short.

const
    array = [{ place: "here", name: "stuff", n: 1 }, { place: "there", name: "morestuff", n: 2 }, { place: "there", name: "morestuff", n: 3 }],
    keys = ['place', 'name'],
    unique = array.filter(
        (s => o => (v => !s.has(v) && s.add(v))(keys.map(k => o[k]).join('|')))
        (new Set)
    );

console.log(unique);
Nina Scholz
  • 376,160
  • 25
  • 347
  • 392
1

Works for me:

const uniqueArray = products.filter( (value,index) => {
  return index === products.findIndex( (obj) => { 
    return JSON.stringify(obj) === JSON.stringify(value);
  }) 
})
jonathasborges1
  • 2,351
  • 2
  • 10
  • 17
0

Here is another technique to count duplicates and remove them easily from your data array. dupsCount is the number of duplicates found. Sort your data first, then remove: comparing only neighbouring elements makes this a fast way to remove duplicates.

  dataArray.sort(function (a, b) {
            var textA = a.name.toUpperCase();
            var textB = b.name.toUpperCase();
            return (textA < textB) ? -1 : (textA > textB) ? 1 : 0;
        });
        var dupsCount = 0;
        for (var i = 0; i < dataArray.length - 1; ) {
            if (dataArray[i].name == dataArray[i + 1].name) {
                dupsCount++;
                dataArray.splice(i, 1);
            } else {
                i++;
            }
        }
HD..
  • 1,456
  • 3
  • 28
  • 48
0

If you need a unique array based on multiple properties in the object, you can do this with map, combining the properties of the object.

    var hash = array.map(function(element){
        var string = ''
        for (var key in element){
            string += element[key]
        }
        return string
    })
    array = array.filter(function(element, index){
        var string = ''
        for (var key in element){
            string += element[key]
        }
        return hash.indexOf(string) == index
    })
0

Generic for any array of objects:

/**
* Remove duplicated values without losing information
*/
const removeValues = (items, key) => {
  let tmp = {};

  items.forEach(item => {
    tmp[item[key]] = (!tmp[item[key]]) ? item : Object.assign(tmp[item[key]], item);
  });
  items = [];
  Object.keys(tmp).forEach(key => items.push(tmp[key]));

  return items;
}

Hope it helps someone.

aSoler
  • 145
  • 1
  • 9
0

Another way would be to use the reduce function with a new array as the accumulator. If there is already a thing with the same name in the accumulator array, it isn't added again.

let list = things.thing;
list = list.reduce((accumulator, thing) => {
    if (!accumulator.filter((duplicate) => thing.name === duplicate.name)[0]) {
        accumulator.push(thing);
    }
    return accumulator;
}, []);
thing.things = list;

I'm adding this answer because I couldn't find a nice, readable ES6 solution (I use Babel to handle arrow functions) that's compatible with Internet Explorer 11. The problem is that IE11 doesn't have Map.values() or Set.values() without a polyfill. For the same reason I used filter()[0] to get the first element instead of find().

Keammoort
  • 3,075
  • 15
  • 20
0
 var testArray= ['a','b','c','d','e','b','c','d'];

 function removeDuplicatesFromArray(arr){

 var obj={};
 var uniqueArr=[];
 for(var i=0;i<arr.length;i++){ 
    if(!obj.hasOwnProperty(arr[i])){
        obj[arr[i]] = arr[i];
        uniqueArr.push(arr[i]);
    }
 }

return uniqueArr;

}
var newArr = removeDuplicatesFromArray(testArray);
console.log(newArr);

Output: [ 'a', 'b', 'c', 'd', 'e' ]
0

If you don't want to specify a list of properties:

function removeDuplicates(myArr) {
  var props = Object.keys(myArr[0])
  return myArr.filter((item, index, self) =>
    index === self.findIndex((t) => (
      props.every(prop => {
        return t[prop] === item[prop]
      })
    ))
  )
}

OBS! Not compatible with IE11.
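For example, with the array from the question (the function restated so the snippet runs on its own):

```javascript
// Dedupes by comparing every property of the first element's shape.
function removeDuplicates(myArr) {
  var props = Object.keys(myArr[0])
  return myArr.filter((item, index, self) =>
    index === self.findIndex((t) => (
      props.every(prop => {
        return t[prop] === item[prop]
      })
    ))
  )
}

var things = [
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" }
];

var result = removeDuplicates(things);
console.log(result);
```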

0

Here is my solution. It searches for duplicates based on object.prop, and when it finds a duplicate object it replaces the value in array1 with the array2 value:

function mergeSecondArrayIntoFirstArrayByProperty(array1, array2) {
    for (var i = 0; i < array2.length; i++) {
        var found = false;
        for (var j = 0; j < array1.length; j++) {
            if (array2[i].prop === array1[j].prop) { // if item exist in array1
                array1[j] = array2[i]; // replace it in array1 with array2 value
                found = true;
            }
        }
        if (!found) // if item in array2 not found in array1, add it to array1
            array1.push(array2[i]);

    }
    return array1;
}
Basheer AL-MOMANI
  • 14,473
  • 9
  • 96
  • 92
0

What about this:

function dedupe(arr, compFn){
    let res = [];
    if (!compFn) compFn = (a, b) => a === b;
    arr.forEach(a => { if (!res.find(b => compFn(a, b))) res.push(a); });
    return res;
}
zipper
  • 377
  • 1
  • 5
  • 18
0

If you find yourself needing to remove duplicate objects from arrays based on particular fields frequently, it might be worth creating a distinct(array, predicate) function that you can import from anywhere in your project. This would look like

const things = [{place:"here",name:"stuff"}, ...];
const distinctThings = distinct(things, thing => thing.place);

The distinct function can use any of the implementations given in the many good answers above. The easiest one uses findIndex:

const distinct = (items, predicate) => items.filter((uniqueItem, index) =>
    items.findIndex(item =>
        predicate(item) === predicate(uniqueItem)) === index);
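A quick check of `distinct` against example data similar to the question's (restated so it runs standalone):

```javascript
const distinct = (items, predicate) => items.filter((uniqueItem, index) =>
    items.findIndex(item =>
        predicate(item) === predicate(uniqueItem)) === index);

const things = [
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "evenmorestuff" }  // same place, different name
];

// Only the first entry for each place survives.
const distinctThings = distinct(things, thing => thing.place);
console.log(distinctThings);
```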
0

You can use Object.values() combined with Array.prototype.reduce():

const things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});

const result = Object.values(things.thing.reduce((a, c) => (a[`${c.place}${c.name}`] = c, a), {})); 

console.log(result);
Yosvel Quintero
  • 18,669
  • 5
  • 37
  • 46
0

Keep it simple. Fancy is good, but unreadable code is useless. Enjoy :-)

var a = [
 {
  executiveId: 6873702,
  largePhotoCircle: null,
  name: "John A. Cuomo",
  photoURL: null,
  primaryCompany: "VSE CORP",
  primaryTitle: "Chief Executive Officer, President and Director"
 },
 {
  executiveId: 6873702,
  largePhotoCircle: null,
  name: "John A. Cuomo",
  photoURL: null,
  primaryCompany: "VSE CORP",
  primaryTitle: "Chief Executive Officer, President and Director"
 },
 {
  executiveId: 6873703,
  largePhotoCircle: null,
  name: "John A. Cuomo",
  photoURL: null,
  primaryCompany: "VSE CORP",
  primaryTitle: "Chief Executive Officer, President and Director",
 }
];

function filterDuplicate(myArr, prop) {
      // Format - (1)

      // return myArr.filter((obj, pos, arr) => {
      //     return arr.map(mapObj => mapObj[prop]).indexOf(obj[prop]) === pos;
      // });

      // Format - (2): index each element by the given property,
      // then collect the surviving values.
      var res = {};
      for (var elem of myArr) {
        res[elem[prop]] = elem;
      }
      return Object.values(res);
  }
  
let finalRes = filterDuplicate(a,"executiveId");
console.log("finalResults : ",finalRes);
sg28
  • 1,363
  • 9
  • 19
0

You can also create a generic function that will filter the array based on the object key you pass to it:

function getUnique(arr, comp) {

  return arr
   .map(e => e[comp])
   .map((e, i, final) => final.indexOf(e) === i && i)  // store the keys of the unique objects
   .filter(e => arr[e]).map(e => arr[e]); // eliminate the dead keys & store unique objects

 }

and you can call the function like this,

getUnique(things.thing,'name') // to filter on basis of name

getUnique(things.thing,'place') // to filter on basis of place
Sksaif Uddin
  • 642
  • 1
  • 15
  • 22
0

If you want to de-duplicate your array based on several properties and not just one, you can use the uniqBy function of Lodash, which can take a function as its second argument.

You will have this one-liner:

 _.uniqBy(array, e => `${e.place}|${e.name}`)
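If Lodash isn't available, the same composite-key idea can be sketched with a plain `Map` (sample data hypothetical; note a `Map` keeps the last entry per key, which makes no difference when the duplicates are identical):

```javascript
const array = [
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" }
];

// Key each object by place + name; duplicate keys collapse in the Map.
const unique = [...new Map(array.map(e => [`${e.place}|${e.name}`, e])).values()];

console.log(unique);
```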
Simon
  • 6,025
  • 7
  • 46
  • 98
0
function dupData() {
  var arr = [{ comment: ["a", "a", "bbb", "xyz", "bbb"] }];
  var comment = arr[0].comment;
  let newData = [];
  comment.forEach(function (val) {
    if (newData.indexOf(val) === -1) { newData.push(val); }
  });
  return newData; // ["a", "bbb", "xyz"]
}
Mario Petrovic
  • 7,500
  • 14
  • 42
  • 62
Aman Singh
  • 347
  • 2
  • 4
0
    function genFilterData(arr, key, key1) {
      // Deduplicate by `key`, falling back to `key1` when `key` is missing.
      const data = [...new Map(arr.map((x) => [x[key] || x[key1], x])).values()];

      const makeData = [];
      for (let i = 0; i < data.length; i += 1) {
        makeData.push({ [key]: data[i][key], [key1]: data[i][key1] });
      }

      return makeData;
    }
    const arr = [
      { make: "here1", makeText: 'hj', k: 9, l: 99 },
      { make: "here", makeText: 'hj', k: 9, l: 9 },
      { make: "here", makeText: 'hj', k: 9, l: 9 }
    ];

    const finalData = genFilterData(arr, 'make', 'makeText');

    console.log(finalData);
0

If you are using the Lodash library, you can use the function below as well. It should remove duplicate objects.

var objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];
_.uniqWith(objects, _.isEqual);
Tomer Shetah
  • 8,413
  • 7
  • 27
  • 35
0

We can leverage JavaScript's Set object and Array's filter function. For example:

// Example array
const arr = [{ id: '1' }, { id: '2' }, { id: '1' }];
// Gather the set of unique ids to filter on.
const uniqIds = arr.reduce((ids, el) => ids.add(el.id), new Set());
// Set#delete returns true only the first time an id is removed,
// so only the first occurrence of each id survives the filter.
const uniqElements = arr.filter((el) => uniqIds.delete(el.id));

console.log(uniqElements);
0

You can use a for loop and a condition to keep only unique items:

const data = [
{ id: 1 },
{ id: 2 },
{ id: 3 },
{ id: 4 },
{ id: 5 },
{ id: 6 },
{ id: 6 },
{ id: 6 },
{ id: 7 },
{ id: 8 },
{ id: 8 },
{ id: 8 },
{ id: 8 }
];

const filtered= []

for (let i = 0; i < data.length; i++) {
    let isUnique = true;
    for (let j = 0; j < filtered.length; j++) {
        if (filtered[j].id === data[i].id) {
            isUnique = false;
        }
    }
    if (isUnique) {
        filtered.push(data[i]);
    }
}
console.log(filtered);

/*
output
[ { id: 1 },
  { id: 2 },
  { id: 3 },
  { id: 4 },
  { id: 5 },
  { id: 6 },
  { id: 7 },
  { id: 8 } ]

*/
Fajrul Aulia
  • 147
  • 2
  • 11
0

That's my solution: add the array items into a key/value object, where the key is the unique identifier and the value can be any property of the object, or the whole object.

Explanation: the main array with duplicate items is transformed into a key/value object. If the Id already exists in the unique object, its value is overwritten. At the end, the unique object is converted back into an array.

getUniqueItems(array) {
    const unique = {};
    // here we are assigning item.name, but it could be the complete object
    array.forEach(item => unique[item.Id] = item.name);
    // here you can shape your array item like {text: unique[key], value: key}, but you can do whatever you want
    return Object.keys(unique).map(key => ({text: unique[key], value: key}));
}
Abel Valdez
  • 2,368
  • 1
  • 16
  • 33
0

TypeScript function to filter an array to its unique elements where uniqueness is decided by the given predicate function:

function uniqueByPredicate<T>(arr: T[], predicate: (a: T, b: T) => boolean): T[] {
  return arr.filter((v1, i, a) => a.findIndex(v2 => predicate(v1, v2)) === i);
}

Without typings:

function uniqueByPredicate(arr, predicate) {
  return arr.filter((v1, i, a) => a.findIndex(v2 => predicate(v1, v2)) === i);
}
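For example, deduplicating on two fields with the plain-JavaScript version:

```javascript
function uniqueByPredicate(arr, predicate) {
  return arr.filter((v1, i, a) => a.findIndex(v2 => predicate(v1, v2)) === i);
}

// Keep only the first object for each place + name combination.
const result = uniqueByPredicate(
  [
    { place: "here", name: "stuff" },
    { place: "there", name: "morestuff" },
    { place: "there", name: "morestuff" }
  ],
  (a, b) => a.place === b.place && a.name === b.name
);

console.log(result);
```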
zr0gravity7
  • 2,917
  • 1
  • 12
  • 33
-1

This is a simple way to remove duplicates from an array of objects.

I work with data a lot and this is useful for me.

const data = [{name: 'AAA'}, {name: 'AAA'}, {name: 'BBB'}, {name: 'AAA'}];
function removeDuplicity(datas){
    const names = datas.map(item => item.name);
    return datas.filter((item, index) => index === names.indexOf(item.name));
}

console.log(removeDuplicity(data))

will print to the console:

[{ name: "AAA" }, { name: "BBB" }]
Sunil
  • 3,404
  • 10
  • 23
  • 31
Juraj
  • 260
  • 2
  • 5
  • This solution is designed to remove duplicity from static array, but when u push data from backend into data array then consider using replace. Because in this case the the new value pushed to data array will be removed and the "old" value will still stored in data array. – Juraj Feb 11 '18 at 20:25
-1
function filterDuplicateQueries(queries){
    let uniqueQueries = [];
    queries.forEach((l) => {
        let alreadyExist = false;
        uniqueQueries.forEach((k) => {
            if (k.query == l.query) {
                alreadyExist = true;
            }
        });
        if (!alreadyExist) {
            uniqueQueries.push(l);
        }
    });
    return uniqueQueries;
}
ARUN ARUMUGAM
  • 43
  • 1
  • 8
-1

var things = new Object();

things.thing = new Array();

things.thing.push({place:"here",name:"stuff"});
things.thing.push({place:"there",name:"morestuff"});
things.thing.push({place:"there",name:"morestuff"});
console.log(things);
function removeDuplicate(result, id) {
    let duplicate = {};
    return result.filter(ele => !duplicate[ele[id]] && (duplicate[ele[id]] = true));
}
let resolverarray = removeDuplicate(things.thing,'place')
console.log(resolverarray);
Shijo Rs
  • 159
  • 1
  • 2
-3

Here is a solution using JavaScript's filter function that is quite easy. Let's say you have an array like this.

var duplicatesArray = ['AKASH','AKASH','NAVIN','HARISH','NAVIN','HARISH','AKASH','MANJULIKA','AKASH','TAPASWENI','MANJULIKA','HARISH','TAPASWENI','AKASH','MANISH','HARISH','TAPASWENI','MANJULIKA','MANISH'];

The filter function will allow you to create a new array, using a callback function once for each element in the array. So you could set up the unique array like this.

var uniqueArray = duplicatesArray.filter(function(elem, pos) {return duplicatesArray.indexOf(elem) == pos;});

In this scenario your unique array runs through all of the values in the duplicate array. The elem variable represents the value of the current element ('AKASH', 'NAVIN', ...), pos is its 0-indexed position in the array (0, 1, 2, 3...), and duplicatesArray.indexOf(elem) is the index of the first occurrence of that element in the original array. Because the element 'AKASH' is duplicated, the first time we hit it, pos is 0 and indexOf(elem) is 0 as well, so 'AKASH' gets pushed to uniqueArray. The second time we hit 'AKASH', pos is 1 but indexOf(elem) is still 0 (because indexOf only finds the first instance of an element), so the duplicate is not pushed. Therefore, uniqueArray contains only unique values.
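Running the snippet end to end confirms the walkthrough:

```javascript
var duplicatesArray = ['AKASH','AKASH','NAVIN','HARISH','NAVIN','HARISH','AKASH','MANJULIKA','AKASH','TAPASWENI','MANJULIKA','HARISH','TAPASWENI','AKASH','MANISH','HARISH','TAPASWENI','MANJULIKA','MANISH'];

// Keep each element only at its first occurrence.
var uniqueArray = duplicatesArray.filter(function(elem, pos) {
  return duplicatesArray.indexOf(elem) == pos;
});

console.log(uniqueArray); // ['AKASH', 'NAVIN', 'HARISH', 'MANJULIKA', 'TAPASWENI', 'MANISH']
```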
