
I have the following JSON -

{  
   "node1":[  
      {  
         "one":"foo",
         "two":"foo",
         "three":"foo",
         "four":"foo"
      },
      {  
         "one":"bar",
         "two":"bar",
         "three":"bar",
         "four":"bar"
      },
      {  
         "one":"foo",
         "two":"foo",
         "three":"foo",
         "four":"foo"
      }
   ],
   "node2":[  
      {  
         "link":"baz",
         "link2":"baz"
      },
      {  
         "link":"baz",
         "link2":"baz"
      },
      {  
         "link":"qux",
         "link2":"qux"
      }
   ]
}

I have the following JavaScript that will remove duplicates from the node1 section -

function groupBy(items, propertyName) {
    var result = [];
    $.each(items, function (index, item) {
        if ($.inArray(item[propertyName], result) == -1) {
            result.push(item[propertyName]);
        }
    });
    return result;
}

groupBy(catalog.node1, 'one');

However this does not account for duplicates in node2.

The resulting JSON I require is to look like -

{  
   "node1":[  
      {  
         "one":"foo",
         "two":"foo",
         "three":"foo",
         "four":"foo"
      },
      {  
         "one":"bar",
         "two":"bar",
         "three":"bar",
         "four":"bar"
      }
   ],
   "node2":[  
      {  
         "link":"baz",
         "link2":"baz"
      },
      {  
         "link":"qux",
         "link2":"qux"
      }
   ]
}

However I cannot get this to work; groupBy only returns an array of the property values with duplicates removed, not a restructured JSON object.

Ebikeneser

3 Answers


Here is my version:

var obj = {}; // JSON object provided in the post.

var result = Object.keys(obj);

var test = result.map(function (o) {
    obj[o] = obj[o].reduce(function (a, c) {
        if (!a.some(function (item) {
            return JSON.stringify(item) === JSON.stringify(c);
        })) {
            a.push(c);
        }
        return a;
    }, []);
    return obj[o];
});

console.log(obj); // outputs the expected result

Using Array.prototype.reduce along with Array.prototype.some, I check whether each item already exists in the new array that reduce accumulates (the variable named a) by doing:

a.some(function(item){ return JSON.stringify(item) === JSON.stringify(c); })

Array.prototype.some will loop through this new array and compare the existing items against the new item c using JSON.stringify.

Dalorzo
  • I understand your logic, however this is erroring with - 0x800a01b6 - JavaScript runtime error: Object doesn't support property or method 'keys'. I think this is connected with using keys in IE8, however I am using IE11?? Also I tried it in Chrome and it only brings back [object][object] – Ebikeneser Aug 21 '14 at 14:36
  • This is compatible with ES5, meaning it works with IE9+. If you need this to work with earlier versions, you need to polyfill as recommended here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/keys – Dalorzo Aug 21 '14 at 15:12

You should probably look for a good implementation of a JavaScript set and use that to represent your node objects. The set data structure ensures that you only keep unique items.
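As an illustration of that idea (my own sketch, not code from this answer): an ES2015 Set compares objects by reference, so a set-based dedup for plain objects like these would serialize each object into a string key first.

```javascript
// Sketch: deduplicate an array of plain objects with an ES2015 Set of
// serialized keys. Assumes ES2015 support and consistent key order in
// the objects being compared.
function dedupWithSet(items) {
  var seen = new Set();
  return items.filter(function (item) {
    var key = JSON.stringify(item);
    if (seen.has(key)) {
      return false; // already kept an identical object
    }
    seen.add(key);
    return true;
  });
}

var node2 = [
  { link: "baz", link2: "baz" },
  { link: "baz", link2: "baz" },
  { link: "qux", link2: "qux" }
];
console.log(dedupWithSet(node2)); // two unique entries remain
```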

On the other hand, you may try to write your own dedup algorithm. Here is one example:

function dedup(data, equals){
    if(data.length > 1){
        return data.reduce(function(set, item){
            var alreadyExist = set.some(function(unique){
                return equals(unique, item);
            });
            if(!alreadyExist){
                set.push(item);
            }
            return set;
        },[]);
    }
    return [].concat(data);
}

Unfortunately, the performance of this algorithm is not too good; I think it is somewhat like O(n^2/2), since I check the set of unique items every time to verify whether a given item already exists. This won't be a big deal if your structure is really that small, but at any rate, this is where a hash-based or a tree-based algorithm would probably do better.
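As a rough sketch of that hash-based idea (my own illustration, assuming a caller-supplied keyOf function; not part of the answer): computing each item's key once and storing it in a plain object used as a hash map avoids rescanning the unique set for every item, giving roughly O(n) work.

```javascript
// Sketch: hash-based dedup. keyOf is an assumed helper that maps an
// item to a string key; items with equal keys are treated as duplicates.
function dedupHashed(data, keyOf) {
  var seen = {};
  var result = [];
  for (var i = 0; i < data.length; i++) {
    var key = keyOf(data[i]);
    if (!seen.hasOwnProperty(key)) {
      seen[key] = true; // first time we see this key: keep the item
      result.push(data[i]);
    }
  }
  return result;
}

// Example key function: serialize the whole object.
var byJson = function (item) { return JSON.stringify(item); };
```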

You can also see that I have abstracted away the definition of what is "equal", so you can provide that as a secondary function. Most likely the use of JSON.stringify is a bad idea because it takes time to serialize an object. If you can write your own customized algorithm to compare key by key, that'd probably be better.
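A key-by-key comparison for flat objects like the ones in the question could look like this sketch (my assumption; it only handles primitive values, not nested objects):

```javascript
// Sketch: shallow key-by-key equality for flat objects. Unlike the
// JSON.stringify approach, this is insensitive to key order.
function shallowEquals(left, right) {
  var leftKeys = Object.keys(left);
  var rightKeys = Object.keys(right);
  if (leftKeys.length !== rightKeys.length) {
    return false;
  }
  return leftKeys.every(function (key) {
    return right.hasOwnProperty(key) && left[key] === right[key];
  });
}
```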

So, a naive (not recommended) implementation of equals could be somewhat like the one proposed in the other answer:

var equals = function(left, right){ 
  return JSON.stringify(left) === JSON.stringify(right); 
};

And then you could simply do:

var res = Object.keys(source).reduce(function(res, key){
    res[key] = dedup(source[key], equals);
    return res;
},{});
Edwin Dalorzo
  • 1
    Spot on works a treat, the runtime isnt really an issue for me as it is small data set that I am running against. Well structured and informative answer +1. – Ebikeneser Aug 25 '14 at 10:02

Try this:

var duplicatedDataArray = [];
var DuplicatedArray = [];

// Avoiding duplicates in array data
var givenData = { givenDataForDuplication: givenArray }; // givenArray is your input array
$.each(givenData.givenDataForDuplication, function (index, value) {
    if ($.inArray(value.ItemName, duplicatedDataArray) == -1) {
        duplicatedDataArray.push(value.ItemName);
        DuplicatedArray.push(value);
    }
});
Suganth G