Consider the following sample:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

class Program
{
    // Keeps every worker thread alive until the main thread releases them.
    static ManualResetEvent signal;

    static void DoNothing()
    {
        signal.WaitOne();
    }

    static void Main(string[] args)
    {
        signal = new ManualResetEvent(false);
        List<Thread> threads = new List<Thread>();
        try
        {
            while (true)
            {
                Console.WriteLine($"{threads.Count}, Memory: {Process.GetCurrentProcess().PrivateMemorySize64 / (1024 * 1024)} MB");
                var thread = new Thread(DoNothing);
                thread.Start();
                threads.Add(thread);
            }
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine($"Out of memory at: {threads.Count}");
            signal.Set();
        }
        threads.ForEach(t => t.Join());
        Console.WriteLine("Finished.");
        Console.ReadLine();
    }
}
The code is compiled as a 32-bit process.
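(To rule out an accidental 64-bit build, I print a quick sanity check at startup; IntPtr.Size works on both target frameworks, whereas Environment.Is64BitProcess only exists since .NET 4.0.)

// IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit one.
Console.WriteLine($"Pointer size: {IntPtr.Size} bytes");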
I discovered that it behaves differently when compiled for .NET 3.5 and for .NET 4.x; the only thing I change is the target framework version.
When compiled for .NET 3.5, memory is exhausted at approx. 1 MB per thread created. This is as expected, because the default stack size is 1 MB (https://msdn.microsoft.com/en-us/library/windows/desktop/ms686774(v=vs.85).aspx).
However, when compiled for .NET 4.x, memory is consumed at a pace of approx. 100 KB per thread created, i.e. one tenth of 1 MB.
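Since PrivateMemorySize64 counts only committed private bytes, one possibility is that .NET 4.x still reserves the full 1 MB per stack but commits only a fraction of it. As a rough check (my assumption being that VirtualMemorySize64 tracks total virtual address space, like the Virtual Bytes counter), I could log both values:

// PrivateMemorySize64 = committed private bytes;
// VirtualMemorySize64 = virtual address space, including reservations.
// If the 4.x build grows the latter by ~1 MB per thread but the former by only
// ~100 KB, then only the commit behavior differs, not the reserved stack size.
var p = Process.GetCurrentProcess();
Console.WriteLine($"Committed: {p.PrivateMemorySize64 / (1024 * 1024)} MB, Reserved: {p.VirtualMemorySize64 / (1024 * 1024)} MB");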
Did the default stack size change between .NET 3.5 and 4.x?
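One experiment that should isolate this: pin the stack size explicitly via the Thread(ThreadStart, Int32) overload, which both frameworks support. A sketch (1 MB chosen to match the documented default):

// Request exactly 1 MB of stack instead of relying on the framework default.
// If both builds then consume memory at the same rate, the difference lies in
// the default size (or in how much of it is committed).
var thread = new Thread(DoNothing, 1024 * 1024);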
I conducted these experiments on Windows 10. Is it possible this is related to the version of Windows?