
Delete duplicates in an array

  1. One of the functions uses a native JS ES6 method (Set)
  2. Is there a better method (for compatibility and performance)?

Note: the undefined elements should be filtered out of the array

My English is weak, so the description may be inaccurate. >﹏<

Thanks

function delRepeatArray1(arr) {
    console.time();
    var result = Array.from(new Set(arr));
    console.timeEnd();

    return result;
}

function delRepeatArray2(arr) {
    console.time();
    var result = arr.filter(function (em, index, arr) {
        return arr.indexOf(em) === index;
    });
    console.timeEnd();

    return result;
}

var arr = ["undefined", "200", 0, -0, 200, undefined, undefined, null, true, null, "true", false, 0, true, 200, false],
    result1 = delRepeatArray1(arr),
    result2 = delRepeatArray2(arr);

console.log(result1);
console.log(result2);
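On an array this small, both functions finish in well under a millisecond, so the timings won't tell you much. A minimal benchmark sketch on a larger input (the 20,000-element array, the duplication ratio, and the timer labels are assumptions, not part of the question):

```javascript
// Same two approaches as above, timed on a larger array with many duplicates.
function delRepeatSet(arr) {
    return Array.from(new Set(arr));
}

function delRepeatFilter(arr) {
    return arr.filter(function (em, index, a) {
        return a.indexOf(em) === index;
    });
}

// Build a 20,000-element array containing 500 distinct values.
var big = [];
for (var i = 0; i < 20000; i++) {
    big.push(i % 500);
}

console.time("Set");
var bySet = delRepeatSet(big);
console.timeEnd("Set");

console.time("filter/indexOf");
var byFilter = delRepeatFilter(big);
console.timeEnd("filter/indexOf");

console.log(bySet.length, byFilter.length); // both 500
```

The gap grows with array size because `indexOf` scans the result for every element (quadratic overall), while `Set` membership checks are effectively constant time.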
Phoenix
    Possible duplicate of [Get all unique values in a JavaScript array (remove duplicates)](https://stackoverflow.com/questions/1960473/get-all-unique-values-in-a-javascript-array-remove-duplicates) – SLePort Feb 18 '19 at 08:43
  • Post your expected output please – Maheer Ali Feb 18 '19 at 08:52

4 Answers


Here is a quick snippet (using jQuery) to get the unique values from the provided array:

var arr = ["undefined", "200", 0, -0, 200, undefined, undefined, null, true, null, "true", false, 0, true, 200, false];
var uniqueNames = [];
$.each(arr, function(i, el){
    if($.inArray(el, uniqueNames) === -1) uniqueNames.push(el);
});
console.log(uniqueNames);
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
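The same idea works without jQuery; a dependency-free sketch using Array.prototype.includes (ES2016) in place of $.inArray:

```javascript
// Collect each element the first time it is seen; skip it afterwards.
var arr = ["undefined", "200", 0, -0, 200, undefined, undefined, null, true, null, "true", false, 0, true, 200, false];
var uniqueNames = [];
arr.forEach(function (el) {
    // includes uses SameValueZero, so 0 and -0 count as the same value
    if (!uniqueNames.includes(el)) uniqueNames.push(el);
});
console.log(uniqueNames);
// ["undefined", "200", 0, 200, undefined, null, true, "true", false]
```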
Milan Chheda

I found a benchmark that compares a few methods for this. The fastest method on my system was a plain and simple loop.

function unique(arr) {
  let out = [];
  const len = arr.length;
  for (let i = 0; i != len; ++i) {
    const obj = arr[i];
    if (out.indexOf(obj) == -1) out.push(obj);
  }
  return out;
}

That said, the benchmark shows that using reduce is faster on other systems.

function unique(arr) {
  return arr.reduce((out, obj) => {
    if (out.indexOf(obj) == -1) out.push(obj);
    return out;
  }, []);
}
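Note that both variants above are quadratic, because `indexOf` rescans the output array for every input element. The same plain loop becomes linear if the seen values are tracked in a Set, which has constant-time membership checks (a sketch, not part of the original answer):

```javascript
// Same plain loop as above, but with O(1) membership checks via a Set.
function uniqueFast(arr) {
    const seen = new Set();
    const out = [];
    for (let i = 0; i < arr.length; i++) {
        const obj = arr[i];
        if (!seen.has(obj)) {
            seen.add(obj);
            out.push(obj);
        }
    }
    return out;
}

console.log(uniqueFast([1, 2, 2, 3, 1])); // [1, 2, 3]
```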
Indiana Kernick

Can't we simply make it a Set?

var arr = ["undefined", "200", 0, -0, 200, undefined, undefined, null, true, null, "true", false, 0, true, 200, false];
var filteredarr = arr.filter(function (el) { return el !== undefined; }); // drop undefined (filter(Boolean) would also drop 0, false, null and "")
var set = new Set(filteredarr); // remove duplicates
console.log(set);
Yugandhar Chaudhari

The fastest method is using Set to filter out the duplicates. On your small array there is no difference whatsoever; to benchmark your code you should try a larger array (at least 65k elements, say). Then you will see that the regular loop (your delRepeatArray2) is much slower compared to Set, which basically keeps the same timing it has with a smaller group of items.

That said, we're still talking about a few milliseconds.

If you want to filter out the undefined values, it's better to do it after you have reduced to the unique values, so there is only one undefined in the Set and it can easily be removed:

function delRepeatArray(arr) {
    var set = new Set(arr);
    set.delete(undefined);
    return Array.from(set);
}
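A self-contained sketch of this approach applied to the asker's array (note that Set uses SameValueZero, so -0 collapses into 0):

```javascript
// Deduplicate first, then remove the single remaining undefined.
var arr = ["undefined", "200", 0, -0, 200, undefined, undefined, null, true, null, "true", false, 0, true, 200, false];
var set = new Set(arr);
set.delete(undefined);
var result = Array.from(set);
console.log(result); // ["undefined", "200", 0, 200, null, true, "true", false]
```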
ZER0