Background
Say I have an initial array of objects:
var initialData = [
    {
        'ID': 1,
        'FirstName': 'Sally'
    },
    {
        'ID': 2,
        'FirstName': 'Jim'
    },
    {
        'ID': 3,
        'FirstName': 'Bob'
    }
];
I then get new data (another array of objects):
var newData = [
    {
        'ID': 2,
        'FirstName': 'Jim'
    },
    {
        'ID': 4,
        'FirstName': 'Tom'
    },
    {
        'ID': 5,
        'FirstName': 'George'
    }
];
Goal
I want to merge the new data into the initial data. However, I don't want to overwrite any objects in the initial data array; I just want to add the objects that weren't already there.
I can tell whether two objects are duplicates by comparing their 'ID' keys.
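For the example arrays above, the merged result would look like this (with the new items simply appended to the end):
var mergedData = [
    { 'ID': 1, 'FirstName': 'Sally' },
    { 'ID': 2, 'FirstName': 'Jim' },
    { 'ID': 3, 'FirstName': 'Bob' },
    { 'ID': 4, 'FirstName': 'Tom' },
    { 'ID': 5, 'FirstName': 'George' }
];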
What I've Tried
I know I can do this by looping through the new data, checking whether each item already exists in the initial data, and, if not, pushing it into the initial data.
for ( var i = 0, l = newData.length; i < l; i++ ) {
    if ( ! key_exists( newData[i].ID, initialData ) ) { // key_exists() is a function that uses .filter() to test.
        initialData.push( newData[i] );
    }
}
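For reference, key_exists() is roughly something like this. The actual helper isn't shown above, so this is only a sketch based on the description of a .filter()-based test:
function key_exists( id, data ) {
    // Sketch of the helper described above: keep only the objects whose
    // 'ID' matches, and report whether any were found.
    return data.filter( function ( item ) {
        return item.ID === id;
    } ).length > 0;
}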
I'm concerned about performance, though: the .filter()-based lookup scans the entire initial array for every new item, so this is roughly O(n × m). I know there are lots of new ES6 ways of manipulating arrays, so I'm hoping someone has a better idea.
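One ES6-style idea I've been considering (just a sketch, I haven't benchmarked it) is to build a Set of the existing IDs once, so each duplicate check is a constant-time lookup instead of a scan:
// Collect the IDs already present in the initial data.
const existingIds = new Set( initialData.map( item => item.ID ) );

// Append only the new items whose ID isn't already in the Set.
newData.forEach( item => {
    if ( ! existingIds.has( item.ID ) ) {
        initialData.push( item );
    }
} );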
Question
What is the best way (best as in best performance) to merge the new data into the initial data while ignoring duplicates in the new data?