
Inspired by the article How to merge two arrays in JavaScript and de-duplicate items, I tested how fast and reliable the different variants are for a large number of items. I would also like to suggest another way to solve the problem.

Severin

1 Answer


In this example you can try out which function is faster, and when: https://dojo.telerik.com/oYuCE/4

The function arrayUnique2 works well and is fast on large arrays.

The jQuery function $.unique is also fast, but only if the items are sorted; otherwise, duplicates are not filtered. Sample:

var arr1 = ['Vijendra', 'Singh'], arr2 = ['Shakya', 'Singh'];
alert(JSON.stringify($.unique(arr1.concat(arr2))));

does not filter the duplicates, whereas:

var arr1 = ['Vijendra', 'Singh'], arr2 = ['Singh', 'Shakya'];
alert(JSON.stringify($.unique(arr1.concat(arr2))));

filters the duplicates, because the two 'Singh' entries end up adjacent.
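The sorted-input requirement can be worked around by sorting the concatenated array first. A minimal plain-JS sketch, without jQuery, using a hypothetical helper adjacentUnique that mimics $.unique's adjacent-duplicate filtering:

```javascript
// Mimics $.unique's behavior on a sorted array: keeps an element
// only if it differs from its predecessor, so adjacent duplicates collapse.
function adjacentUnique(arr) {
  return arr.filter(function (v, i) {
    return i === 0 || v !== arr[i - 1];
  });
}

var arr1 = ['Vijendra', 'Singh'], arr2 = ['Shakya', 'Singh'];
// Sorting first guarantees that duplicates end up adjacent.
var merged = adjacentUnique(arr1.concat(arr2).sort());
// merged is ['Shakya', 'Singh', 'Vijendra']
```

Note that the sort step costs O(n log n) and changes the original order of the items.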

The function arrayUnique2 works well:

function arrayUnique2(array) {
  // Use a hash of each value as an object key, so duplicates overwrite each other
  var tmpArrAss = {};
  $.each(array, function (key, vs) {
    tmpArrAss[hashCode(vs)] = vs;
  });
  // Collect the remaining values back into a plain array
  var tmpArr = [];
  $.each(tmpArrAss, function (key, vs) {
    if (vs.length > 0) {
      tmpArr.push(vs);
    }
  });
  return tmpArr;
}

but it needs a hash function for long strings, which is provided here by:

function hashCode(s) {
  // Classic 31-based string hash; "return a & a" truncates to a 32-bit integer
  return s.split("").reduce(function (a, b) {
    a = ((a << 5) - a) + b.charCodeAt(0);
    return a & a;
  }, 0);
}

This has the disadvantage that the function must process every character, so hashing very long strings takes time.
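For completeness, the same idea also works without jQuery. A self-contained sketch using a plain object as the hash map (arrayUniquePlain is my own name for this variant, and the corrected hashCode is repeated so the snippet runs standalone):

```javascript
// 32-bit string hash: a = a*31 + charCode, kept in integer range via "a & a"
function hashCode(s) {
  return s.split("").reduce(function (a, b) {
    a = ((a << 5) - a) + b.charCodeAt(0);
    return a & a;
  }, 0);
}

// Same approach as arrayUnique2, but with forEach/Object.keys instead of $.each
function arrayUniquePlain(array) {
  var seen = {};
  array.forEach(function (v) {
    seen[hashCode(v)] = v; // duplicates hash to the same key and overwrite
  });
  return Object.keys(seen).map(function (k) {
    return seen[k];
  });
}

var merged = arrayUniquePlain(['Vijendra', 'Singh'].concat(['Shakya', 'Singh']));
// merged contains 'Vijendra', 'Singh' and 'Shakya' exactly once each
```

As with the original, two different strings could in principle collide on the same hash value, in which case one of them would be lost.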

In the linked benchmark, arrayUnique2 is about 20 times faster.
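In modern JavaScript (ES2015+), the whole merge-and-dedupe can be done with a Set, which avoids both sorting and hashing; a minimal sketch:

```javascript
var arr1 = ['Vijendra', 'Singh'], arr2 = ['Shakya', 'Singh'];
// A Set keeps each value once and preserves insertion order
var merged = Array.from(new Set(arr1.concat(arr2)));
// merged is ['Vijendra', 'Singh', 'Shakya']
```

Unlike the hash-based variant, a Set compares values directly, so there is no risk of collisions.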
