My application needs to perform a number of tasks per tenant on a minute-to-minute basis. These are fire-and-forget operations, so I don't want to use Parallel.ForEach, which would block the calling thread until every tenant has been processed.
Instead I'm looping through the list of tenants and firing off a ThreadPool.QueueUserWorkItem to process each tenant's tasks:
foreach (Tenant tenant in tenants)
{
    // Queue each tenant's work on a thread-pool thread; the tenant is passed as the state argument
    ThreadPool.QueueUserWorkItem(new WaitCallback(ProcessTenant), tenant);
}
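For reference, ProcessTenant has the usual WaitCallback shape; something like this (a sketch, with the actual work elided):

private static void ProcessTenant(object state)
{
    // Recover the tenant that was passed as the state argument to QueueUserWorkItem
    Tenant tenant = (Tenant)state;

    // ... perform the minute-to-minute tasks for this tenant ...
}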
This code works perfectly in production, and can generally process over 100 tenants in under 5 seconds.
However, on application startup this pushes CPU utilization to 100% while things like EF are still warming up. To limit this I've implemented a semaphore as follows:
// Allow at most 4 tenant workers to run concurrently
private static Semaphore _threadLimiter = new Semaphore(4, 4);
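Hard-coding 4 matches my current 8-logical-processor boxes; presumably the same limit could be derived from the machine instead (a sketch, assuming Environment.ProcessorCount is the value I want to halve):

private static readonly int _workerLimit = Math.Max(1, Environment.ProcessorCount / 2);
private static Semaphore _threadLimiter = new Semaphore(_workerLimit, _workerLimit);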
Either way, the idea is to limit this task processing to half of the machine's logical processors. Inside the ProcessTenant method I call:
// Block until one of the slots is free, then hold it for the duration of the work
// (WaitOne sits before the try so a failed wait can't trigger an extra Release)
_threadLimiter.WaitOne();
try
{
    // Perform all minute-to-minute tasks
}
finally
{
    _threadLimiter.Release();
}
In testing, this appears to work exactly as expected: CPU utilization on startup stays at around 50%, and the throttling does not appear to slow the initial startup down.
So my question is mainly about what actually happens when WaitOne is called. Does it release the thread to work on other tasks, similar to an asynchronous call? The MSDN documentation states that WaitOne "Blocks the current thread until the current WaitHandle receives a signal."
So I'm wary that the web app won't actually be able to use the blocked thread for other work while it's waiting, which would defeat the point of the exercise.
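For contrast, what I picture a genuinely non-blocking version looking like is roughly the following (just a sketch, not something I've adopted: it swaps the Semaphore for a SemaphoreSlim so the wait can be awaited, and ProcessTenantAsync is a hypothetical async rewrite of my worker):

private static SemaphoreSlim _asyncLimiter = new SemaphoreSlim(4, 4);

private static async Task ProcessTenantAsync(Tenant tenant)
{
    // Awaiting here lets the thread go back to the pool while waiting for a slot,
    // unlike a blocking WaitOne
    await _asyncLimiter.WaitAsync();
    try
    {
        // Perform all minute-to-minute tasks for this tenant
    }
    finally
    {
        _asyncLimiter.Release();
    }
}

The call site would then have to queue Tasks rather than thread-pool work items, so I'd like to understand whether the blocking version above actually ties up a pool thread before going down that route.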