I am trying to understand and state the Big O notation of the following algorithm, which uses reduce(). My understanding is that reduce is a function applied to an array; it takes a callback and an initialValue. The code below is a controller that holds the algorithm:
```javascript
export const getPublicID = (req, res) => {
  const data = req.body.result;
  // Bail out early if the upload payload is missing
  if (!data) return res.status(422).json({ success: false, message: 'Upload Failed!' });

  console.time('ARRAY');
  // Build a new array containing only the fields I need from each item
  const insertStuff = data.reduce((array, item) => {
    array.push({
      public_id: item.public_id,
      url: item.url,
      thumbnail_url: item.thumbnail_url
    });
    return array;
  }, []);
  console.timeEnd('ARRAY');

  Photo.insertMany(insertStuff)
    .then(
      img => res.status(201).json({
        success: true,
        message: 'Successfully added to database.',
        cloudinary: img
      }),
      err => res.status(422).json({ success: false, message: err })
    );
};
```
The req.body.result comes in as an array of objects, and through the reduce method I create my own array of objects, which I then insert into my MongoDB collection.
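For context, here is the same transformation written with map() on a made-up sample payload (my real req.body.result objects have more fields than this, so the extra bytes field below is just an assumption to show they get dropped):

```javascript
// Hypothetical sample shaped like req.body.result
const result = [
  {
    public_id: 'abc123',
    url: 'http://example.com/a.jpg',
    thumbnail_url: 'http://example.com/a_t.jpg',
    bytes: 1024 // extra field that should not end up in the output
  }
];

// Same transformation as the reduce: one output object per input object,
// keeping only the three fields I insert into MongoDB
const insertStuff = result.map(item => ({
  public_id: item.public_id,
  url: item.url,
  thumbnail_url: item.thumbnail_url
}));
```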
Reduce loops through the array once, so my thought is that this is O(n): the more elements present, the more time it takes to iterate, hence a linear graph. If that assumption is correct, my three questions are how the following affect my algorithm:
- push()
- insertMany()
- the promise
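To check the linear-growth assumption empirically, here is a rough timing sketch I put together with made-up data (makeData and transform are helpers I wrote for this test, not part of the controller):

```javascript
// Generate n fake items shaped like the objects in req.body.result
const makeData = n => Array.from({ length: n }, (_, i) => ({
  public_id: `id_${i}`,
  url: `http://example.com/${i}.jpg`,
  thumbnail_url: `http://example.com/${i}_t.jpg`
}));

// The same reduce as in my controller, pulled out into a function
const transform = data => data.reduce((array, item) => {
  array.push({
    public_id: item.public_id,
    url: item.url,
    thumbnail_url: item.thumbnail_url
  });
  return array;
}, []);

// Time it at a few sizes; if the step is O(n), each 10x increase
// in n should take roughly 10x as long
for (const n of [1e4, 1e5, 1e6]) {
  const data = makeData(n);
  console.time(`reduce n=${n}`);
  transform(data);
  console.timeEnd(`reduce n=${n}`);
}
```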
Thanks for helping a data structures and algorithms noob understand the pros and cons of this code, I greatly appreciate it!