
Possible Duplicate:
Array unique values
Get unique results from JSON array using jQuery

I have a JSON array like this (shown as the console prints it):

[
 Object { id="38",product="foo"},
 Object { id="38",product="foo"},
 Object { id="38",product="foo"},
 Object { id="39",product="bar"},
 Object { id="40",product="hello"},
 Object { id="40",product="hello"}
]

There are duplicate values in this array. How can I make it unique, like this:

[
 Object { id="38",product="foo"},
 Object { id="39",product="bar"},
 Object { id="40",product="hello"}
]

I'm looking for a suggestion that uses as few iterations as possible; jQuery's $.inArray is not working in this case.

Suggestions to use third-party libraries are welcome.

coolguy
  • Would you like to remove elements with the same id/product combination or just with the same id? – davids Aug 06 '12 at 10:18
  • 2
    The most efficient way to do that is to turn the array into a hash with elements as keys. Then turn the hash into an array. Should give you `O(n)`. – freakish Aug 06 '12 at 10:21
  • Actually this has nothing to do with JSON at all ;) – Torsten Walter Aug 07 '12 at 08:36
  • So how do you do it, @TorstenWalter? Any ideas are welcome :) – coolguy Aug 07 '12 at 08:39
  • 3
    I have answered below :) I was referring to the tag `JSON`. Whether the `array` comes from a `JSON` string, is entered manually or generated by JavaScript code is irrelevant for the intended task, which is to filter the array so it only contains distinct values. I don't mean to bash, just try to educate. – Torsten Walter Aug 07 '12 at 08:45
  • One-liner Solution: const uniqueProduct = [... new Set(productArr.map(JSON.stringify))].map(JSON.parse) – Pankaj Shinde Mar 18 '21 at 12:43
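The Set one-liner in the last comment can be unpacked like this (ES2015+; note it treats two objects as duplicates only when they stringify identically, i.e. same keys in the same order):

```javascript
var productArr = [
    { id: "38", product: "foo" },
    { id: "38", product: "foo" },
    { id: "39", product: "bar" },
    { id: "40", product: "hello" },
    { id: "40", product: "hello" }
];

// Serialize each object, let Set drop the exact-duplicate strings,
// then parse the survivors back into objects.
var unique = [...new Set(productArr.map(JSON.stringify))].map(JSON.parse);
```

This is concise but relies on serialization; for large arrays or objects with unstable key order, a hash keyed by `id` (as suggested in the comments above) is more robust.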

5 Answers


You can use underscore's uniq.

In your case, you need to provide an iterator to extract 'id':

array = _.uniq(array, true /* array already sorted */, function(item) {
  return item.id;
});
Laurent Perrin

// Assuming first that you have valid JSON
var myList = [
    { "id":"38","product":"foo"},
    { "id":"38","product":"foo"},
    { "id":"38","product":"foo"},
    { "id":"39","product":"bar"},
    { "id":"40","product":"hello"},
    { "id":"40","product":"hello"}
];

// What you're essentially attempting to do is turn this list of objects
// into a dictionary, keyed by id.
var newDict = {};

for (var i = 0; i < myList.length; i++) {
    newDict[myList[i]['id']] = myList[i]['product'];
}

console.log(newDict); // → { "38": "foo", "39": "bar", "40": "hello" }
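If you ultimately need an array of objects back rather than a key/value map, the dictionary can be walked afterwards; a minimal sketch:

```javascript
// id -> product map, as produced by the loop above.
var newDict = { "38": "foo", "39": "bar", "40": "hello" };

// Rebuild the original object shape from the map.
var uniqueArray = [];
for (var id in newDict) {
    if (newDict.hasOwnProperty(id)) {
        uniqueArray.push({ id: id, product: newDict[id] });
    }
}
```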
Tushar Walzade
Aesthete
  • 1
    This is good if you need the data in key value pairs. However, if you need the keys as they are, say for a template, this isn't the best solution. – Torsten Walter Aug 06 '12 at 11:42

Check the solution in the following SO question:

Get unique results from JSON array using jQuery

You'll have to iterate through your array and create a new array which contains unique values.
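A hand-rolled sketch of that single pass, assuming two entries count as duplicates when they share the same `id`, uses a lookup object so each element is visited only once:

```javascript
var source = [
    { id: "38", product: "foo" },
    { id: "38", product: "foo" },
    { id: "39", product: "bar" },
    { id: "40", product: "hello" },
    { id: "40", product: "hello" }
];

// Track which ids have been seen; push only first occurrences.
var seen = {};
var uniques = [];
for (var i = 0; i < source.length; i++) {
    if (!seen[source[i].id]) {
        seen[source[i].id] = true;
        uniques.push(source[i]);
    }
}
```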

Flater

You will probably have to loop through, removing the duplicates. Since the items are objects, comparing them with `!=` only compares references, so compare the `id` fields instead. If the items are stored in order, as your example suggests, a single loop is enough:

function removeDuplicates(arrayIn) {
    var arrayOut = [];
    for (var a = 0; a < arrayIn.length; a++) {
        var last = arrayOut[arrayOut.length - 1];
        // Push the item unless it has the same id as the previous one.
        if (!last || last.id != arrayIn[a].id) {
            arrayOut.push(arrayIn[a]);
        }
    }
    return arrayOut;
}
SReject

You can easily code this yourself. Off the top of my head, this comes to mind:

var seen = [];
var filtered = $.map(originalArray, function(item) {
    if (seen.indexOf(item) < 0) {
        seen.push(item);
        return item;
    }
});

Note that `indexOf` compares object references, so this only removes repeated references, not distinct objects with equal values.

Or, as suggested, a more efficient algorithm specifically for the case at hand:

var helper = {};
var filtered = $.map(originalArray, function(val) {
    var id = val.id;

    if (!helper[id]) {
        helper[id] = val;
        return val;
    }
});
helper = null;
Torsten Walter
  • Quite inefficient. In the worst case `O(n^2)`. – freakish Aug 06 '12 at 10:20
  • If the data returned really has the ids, then of course a hash map with id keys would be the most efficient. – Torsten Walter Aug 06 '12 at 10:22
  • +1: Exactly. If the OP wants an array, then he can easily convert `filtered` into an array. This solution should be `O(n)` in terms of computational complexity although uses at least twice as much memory. Shouldn't be a problem, though ( who would send such big JSONs to client side anyway? ). – freakish Aug 06 '12 at 10:27
  • Memory should not be significantly more since the object and array contain only references to the same objects. – Torsten Walter Aug 06 '12 at 11:40