Let's say I have an array of ten million items. I want to do some operation on each item in a foreach loop and then return that item.
foreach (var item in items)
{
    // Let's pretend this is resource-intensive
    item.someIntProp++;
}
Would breaking the ten million items up into, say, 100k-item batches and then running each batch as its own async operation be any faster?
The actual scenario is mapping a bunch of objects from MongoDB BSON values into .NET objects using AutoMapper. No database calls are made during this process; .NET is just converting BsonString to string, etc.
On one hand it seems like the answer is "yes, it will be faster, because multiple batches will be processed simultaneously rather than one after another." On the other hand, it seems unlikely that the runtime wouldn't already be optimizing this.
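For CPU-bound work like this, the batching idea is usually expressed with data parallelism rather than async tasks; `Parallel.ForEach` partitions the array into ranges and runs them on thread-pool threads for you. A minimal sketch under that assumption (the `Item` class and `SomeIntProp` property are placeholder stand-ins for the mapped objects):

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

class Item
{
    public int SomeIntProp;
}

class Program
{
    static void Main()
    {
        var items = Enumerable.Range(0, 10_000_000)
                              .Select(_ => new Item())
                              .ToArray();

        // Sequential baseline: one thread walks the whole array.
        // foreach (var item in items) { item.SomeIntProp++; }

        // Parallel version: the runtime partitions the array into
        // ranges ("batches") and processes them concurrently on the
        // thread pool. This is safe here because each iteration only
        // touches its own item.
        Parallel.ForEach(items, item => item.SomeIntProp++);

        Console.WriteLine(items[0].SomeIntProp);
    }
}
```

Whether this actually wins depends on how heavy the per-item work is: for something as cheap as an increment, the partitioning overhead can eat the gain, while a non-trivial AutoMapper mapping per item is more likely to benefit. Note also that `async`/`await` alone wouldn't help, since there is no I/O to overlap; this is pure CPU work.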