I thought my problem was the following: how do I prepare for parallel execution?
My current code resembles this pattern:
var worker = new Worker();
while (true)
worker.RandomTest();
RandomTest
is mostly CPU-bound. On rare occasions a worker writes to a file (after obtaining a lock on it), but this is so infrequent that I/O delays shouldn't need to be considered.
I would like to run this in parallel. I could use Parallel.For
with some big number as the upper limit (instead of while (true)
), but I don't know how to prepare for such execution, because I don't want to create a worker at each iteration.
To prepare, I would need to know the parallel pool size in advance, and also, inside the loop, the index of the current thread/task/job, so I could associate the current execution path with its worker.
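If I read the docs correctly, Parallel.For has an overload with thread-local state that might express this: localInit runs once per partition (not once per iteration), so each partition could create its own Worker. A sketch of what I mean (Worker is my class from above):

```csharp
// Sketch: Parallel.For overload with thread-local state.
// localInit runs once per partition/task, not once per iteration.
Parallel.For(0, 1000,
    () => new Worker(),          // localInit: one Worker per partition
    (i, loopState, worker) =>
    {
        worker.RandomTest();     // reuse this partition's Worker
        return worker;           // hand it to the next iteration
    },
    worker => { });              // localFinally: nothing to clean up here
```

This would avoid creating a Worker per iteration, but it still doesn't tell me how many partitions will run concurrently.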
As it turns out, my problem is more fundamental. I couldn't figure out how to do the preparation, so since I have an endless loop anyway, I thought -- why not:
Parallel.For(0, 1000, (i, _) => {
    var worker = new Worker();
    while (true)
        worker.RandomTest();
});
Until now I assumed the number of launched executions would stay within sane limits relative to the available CPU cores. But no -- Parallel creates new iterations like crazy, so the already-created jobs are basically stalled by the flood of incoming ones.
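I know I can cap the concurrency with ParallelOptions, which stops the flood; a sketch of what I tried (the cap value here is just a guess):

```csharp
// Sketch: cap how many iterations run concurrently.
var options = new ParallelOptions
{
    MaxDegreeOfParallelism = Environment.ProcessorCount  // guessed cap
};
Parallel.For(0, 1000, options, (i, _) =>
{
    var worker = new Worker();  // one Worker per running iteration
    while (true)
        worker.RandomTest();
});
```

But this just moves the problem: I am the one picking the number.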
Of course I can hardcode a fixed number of parallel jobs to run, but then the responsibility of figuring out how much is not too much and not too little falls on me.
I know how to set a fixed number, but how do I pick a good one? I.e., what happens when I run the program on a different machine, or under different conditions (for example with a CPU-demanding process in the background)?
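The only alternative I've considered (a sketch; I'm not sure it's the right approach) is to skip Parallel entirely and spawn one dedicated long-running task per logical core, each owning its own Worker:

```csharp
using System;
using System.Threading.Tasks;

// Sketch: one dedicated task (and one Worker) per logical core.
int jobs = Environment.ProcessorCount;
var tasks = new Task[jobs];
for (int t = 0; t < jobs; t++)
{
    tasks[t] = Task.Factory.StartNew(() =>
    {
        var worker = new Worker();       // each task owns its own Worker
        while (true)
            worker.RandomTest();
    }, TaskCreationOptions.LongRunning); // hint: give each job its own thread
}
Task.WaitAll(tasks);
```

Even here, though, Environment.ProcessorCount is still my hardcoded guess at the "good number", and it ignores other load on the machine.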
Here Worker
is not a thread-safe type, thus I create as many workers as there are jobs to do.