I would like to use the Parallel.ForEach mechanism to ensure full utilisation of the CPU for a CPU-intensive task. I am querying a large number of objects from a database one at a time (only one object within each iteration, each object reasonably small), then performing a significant amount of CPU-based work on each object, after which I save it back to the database.
I am using Entity Framework on the Data Model side, and given the amount of objects that I query I create a new Context for every iteration (this is to limit memory consumption):
foreach (var id in idlist)
{
    using (var ctx = new Context())
    {
        var model = ctx.Models.First(x => x.Id == id);
        await model.GlobalRefresh(true); // CPU-heavy operation.
        await model.SaveAsync();         // Additional CPU-heavy operation.
        ctx.SaveChanges();               // Save the changes.
    } // Dispose of the model and the context, to limit memory consumption.
}
This works well in the synchronous implementation, as after each iteration both the model queried from the database and the Entity Framework context are disposed. My memory consumption during this process is therefore almost constant, which is great. If I don't create the context in this fashion, I quickly run out of memory (500+ objects).
When I set the above up in parallel as follows, my memory consumption goes sky-high: it seems that the context for each iteration is not disposed before the next iteration starts (though I do see significantly better CPU utilisation, as expected):
Parallel.ForEach(idlist, async (id) =>
{
    using (var ctx = new Context())
    {
        var model = ctx.Models.First(x => x.Id == id);
        await model.GlobalRefresh(true);
        await model.SaveAsync();
        ctx.SaveChanges();
    }
});
This is not necessarily a problem from a memory viewpoint, as long as not every model object is loaded into memory at once (loading more than one at a time is, after all, the whole point of the parallel loop). However, is there some way I can manage this process better, e.g. by not starting additional tasks when memory consumption reaches, say, 75%, to avoid an Out Of Memory exception?
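One direction I have considered is throttling the number of concurrent iterations with a SemaphoreSlim, so that only a bounded number of contexts are alive at any moment. A rough sketch of what I mean (Context, GlobalRefresh and SaveAsync are the types and methods from my code above; the choice of maxConcurrency is a guess, and tuning it against memory headroom is exactly the part I am unsure about):

```csharp
// Throttle the loop: at most maxConcurrency iterations run at once,
// so at most that many contexts (and models) are held in memory.
var maxConcurrency = Environment.ProcessorCount;
using (var semaphore = new SemaphoreSlim(maxConcurrency))
{
    var tasks = idlist.Select(async id =>
    {
        await semaphore.WaitAsync(); // Wait for a free slot.
        try
        {
            using (var ctx = new Context())
            {
                var model = ctx.Models.First(x => x.Id == id);
                await model.GlobalRefresh(true);
                await model.SaveAsync();
                ctx.SaveChanges();
            } // Context and model disposed before the slot is released.
        }
        finally
        {
            semaphore.Release(); // Free the slot for the next id.
        }
    });
    await Task.WhenAll(tasks);
}
```

This caps concurrency at a fixed number rather than reacting to actual memory pressure, which is why I am asking whether there is a better, memory-aware way to do it.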