I have a problem with my neural network library. It uses multithreading to speed up computations, but after about 30-60 seconds of runtime my program no longer utilizes 100% of my i7 3610QM (4 cores, 8 threads).
Basically, my processing looks like this (C# with pseudocode):
for each training example t in training set
    for each layer l in neural network
        Parallel.For(0, N, (int i) => { l.processForward(l.regions[i]); });
    for each layer l in neural network (but in reversed order)
        Parallel.For(0, N, (int i) => { l.backPropagateError(l.regions[i]); });
Here, regions is the layer's list of precalculated regions of neurons to process. Every region has the same size (1/N of the current layer), so the tasks are equally sized to minimize the chance that other threads have to wait for the longest task to finish.
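For illustration, here is a minimal sketch of how such equal-sized regions could be precalculated (Region and BuildRegions are hypothetical names for this sketch, not my library's actual code): the layer's neurons are split into N near-equal contiguous index ranges.

static class RegionPartitioner
{
    public struct Region
    {
        public int Start;  // index of the first neuron in this region
        public int Count;  // number of neurons in this region
    }

    // Splits neuronCount neurons into n regions whose sizes differ by at most 1.
    public static Region[] BuildRegions(int neuronCount, int n)
    {
        var regions = new Region[n];
        int baseSize = neuronCount / n;   // minimum neurons per region
        int remainder = neuronCount % n;  // leftover neurons, one extra per region
        int start = 0;
        for (int i = 0; i < n; ++i)
        {
            int size = baseSize + (i < remainder ? 1 : 0);
            regions[i] = new Region { Start = start, Count = size };
            start += size;
        }
        return regions;
    }
}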
Like I said, this processing scheme consumes 100% of my processor only for a short time and then drops to about 80-85%. In my case I set N to Environment.ProcessorCount (= 8).
I can share the whole code/repository if anyone is willing to help.
I tried to investigate, so I created a new console project and put an almost "Hello World" of Parallel.For() in it, and I simply can't tell what is going on. This might be a separate issue from the Parallel.For() usage above, but I would like you to address this problem too. Here is the code:
using System.Threading.Tasks;

class Program
{
    static void Main(string[] args)
    {
        const int n = 1;
        while (true)
        {
            // Sequential version for comparison:
            //int counter = 0; for (int ii = 0; ii < 1000; ++ii) counter++;

            // Parallel version: a single task (n = 1) doing the same work.
            Parallel.For(0, n, (int i) => { int counter = 0; for (int ii = 0; ii < 1000; ++ii) counter++; });
        }
    }
}
In this code, I constantly (the while loop) create one task (n = 1) that has some work to do (increment a counter one thousand times). As far as I know, Parallel.For blocks execution / waits for all parallel calls to finish. If that is true, it should be doing the same work as the commented-out section (given n = 1). But on my computer this program uses 100% of the CPU, as if there were work for more than one thread! How is that possible? When I switch to the commented-out version, the program uses less than 20% of the CPU, which is what I expected. Please help me understand this behaviour.
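In case it is useful, here is a minimal sketch for timing both variants with System.Diagnostics.Stopwatch instead of eyeballing CPU usage (the repetition count of 1,000,000 is an arbitrary choice for illustration); it should make any per-call overhead of Parallel.For measurable:

using System;
using System.Diagnostics;
using System.Threading.Tasks;

class Benchmark
{
    static void Main()
    {
        const int repetitions = 1000000; // arbitrary, just to get measurable times

        // Time the plain sequential loop.
        var sw = Stopwatch.StartNew();
        for (int r = 0; r < repetitions; ++r)
        {
            int counter = 0;
            for (int ii = 0; ii < 1000; ++ii) counter++;
        }
        sw.Stop();
        Console.WriteLine("Sequential:   {0} ms", sw.ElapsedMilliseconds);

        // Time the same work wrapped in a single-iteration Parallel.For.
        sw.Restart();
        for (int r = 0; r < repetitions; ++r)
        {
            Parallel.For(0, 1, (int i) => { int counter = 0; for (int ii = 0; ii < 1000; ++ii) counter++; });
        }
        sw.Stop();
        Console.WriteLine("Parallel.For: {0} ms", sw.ElapsedMilliseconds);
    }
}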