Why does the CPU usage per core decrease as I increase the degree of parallelism in the following simple Parallel.For loop? It ranges from 100% usage with 1 core down to around 40% with 20 cores (see image). I would expect that, since all variables are local to each iteration, the CPU usage per core would remain fairly constant.
public static void myTest()
{
    Parallel.For(0, 50, new ParallelOptions { MaxDegreeOfParallelism = 20 }, (kk) =>
    {
        // Each iteration works only on its own local list.
        List<int> mylist = new List<int>();
        while (true)
        {
            mylist.Add(1);
            if (mylist.Count > 500000)
            {
                // Drop the full list and start over with a fresh one.
                mylist = new List<int>();
            }
        }
    });
}
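For reference, this is a minimal sketch of how the 1-core vs. 20-core comparison could be reproduced, assuming the only thing that changes between runs is the MaxDegreeOfParallelism value (the class name, the RunAtDegree helper, and the hard-coded argument are illustrative, not part of the original test):

using System.Collections.Generic;
using System.Threading.Tasks;

class ParallelCpuTest
{
    // Same body as myTest above, but the degree of parallelism is a parameter
    // so the 1-core and 20-core cases described in the question can be compared.
    static void RunAtDegree(int degree)
    {
        Parallel.For(0, 50, new ParallelOptions { MaxDegreeOfParallelism = degree }, kk =>
        {
            List<int> mylist = new List<int>();
            while (true)
            {
                mylist.Add(1);
                if (mylist.Count > 500000)
                {
                    mylist = new List<int>(); // drop the full list and keep allocating
                }
            }
        });
    }

    static void Main()
    {
        // Watch per-core CPU in Task Manager or a profiler while this runs;
        // change the argument to 1 to reproduce the single-core case.
        RunAtDegree(20);
    }
}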