This is easily done with a Semaphore.
The idea is to create a semaphore with a maximum count of N, where N is the maximum number of threads you allow to run at once. The main loop waits on the semaphore and queues a work item each time it acquires a slot.
Semaphore ThreadsAvailable = new Semaphore(10, 10);

while (Queue.Count > 0)
{
    ThreadsAvailable.WaitOne();
    // Must dequeue the item here; otherwise you could run off the end of the queue
    ThreadPool.QueueUserWorkItem(DoStuff, Queue.Dequeue());
}
// Wait for the remaining workers to finish by re-acquiring all 10 slots;
// this only completes once every worker has called Release
int threadCount = 10;
while (threadCount != 0)
{
    ThreadsAvailable.WaitOne();
    --threadCount;
}
void DoStuff(object item)
{
    ItemType theItem = (ItemType)item;
    // Process the item
    StartProcessing(theItem);
    // And then release the semaphore so another worker can run
    ThreadsAvailable.Release();
}
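The snippets above refer to a queue, an ItemType, and a StartProcessing method that they don't define. As a rough self-contained sketch of how the pieces might fit together (ItemType and StartProcessing are placeholders for your actual work items and processing logic):

using System.Collections.Generic;
using System.Threading;

class ThrottledProcessor
{
    // Allow at most 10 items to be processed at once
    private readonly Semaphore threadsAvailable = new Semaphore(10, 10);
    private readonly Queue<ItemType> queue = new Queue<ItemType>();

    public void ProcessAll()
    {
        while (queue.Count > 0)
        {
            threadsAvailable.WaitOne();
            ThreadPool.QueueUserWorkItem(DoStuff, queue.Dequeue());
        }

        // Re-acquire all 10 slots; this completes only after every worker has released
        for (int i = 0; i < 10; ++i)
        {
            threadsAvailable.WaitOne();
        }
    }

    private void DoStuff(object item)
    {
        ItemType theItem = (ItemType)item;
        StartProcessing(theItem);
        threadsAvailable.Release();
    }

    private void StartProcessing(ItemType item)
    {
        // Placeholder for the real per-item work
    }
}

// Placeholder for whatever type you're actually processing
class ItemType
{
}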
The item is dequeued in the main loop because that avoids a race condition that is otherwise messy to handle. If you let the worker thread dequeue the item, then the thread has to do this:
lock (queue)
{
    if (queue.Count > 0)
    {
        item = queue.Dequeue();
    }
    else
    {
        // There wasn't an item to dequeue
        return;
    }
}
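In that design, the whole worker might look something like this sketch. Note the extra Release on the early-return path (without it, the drain loop above would wait forever), and the try/finally, which frees the slot even if processing throws:

void DoStuff(object state)
{
    ItemType item;

    lock (queue)
    {
        if (queue.Count > 0)
        {
            item = queue.Dequeue();
        }
        else
        {
            // Another worker took the last item; free the slot and quit
            ThreadsAvailable.Release();
            return;
        }
    }

    try
    {
        StartProcessing(item);
    }
    finally
    {
        // Release the slot even if StartProcessing throws
        ThreadsAvailable.Release();
    }
}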
Without that locked check, the following sequence of events is likely to occur when there is only one item left in the queue:

1. The main loop checks Queue.Count, which returns 1.
2. The main loop calls QueueUserWorkItem.
3. The main loop checks Queue.Count again, which still returns 1 because the first worker hasn't started yet.
4. The main loop calls QueueUserWorkItem again.
5. The first worker starts and dequeues the last item.
6. The second worker tries to dequeue an item and throws an exception because queue.Count == 0.
If you're willing to handle things that way, then you're okay. The key is making sure that the thread calls Release on the semaphore before it exits. You can do that with explicitly managed threads or with the ThreadPool approach that I posted. I just used ThreadPool because I find it easier than explicitly managing threads.
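For comparison, a minimal sketch of the explicitly managed thread version (same placeholders as before) could look like this; one easy way to guarantee the Release happens no matter how the work ends is to put it in a finally block:

while (Queue.Count > 0)
{
    ThreadsAvailable.WaitOne();
    ItemType item = Queue.Dequeue();

    Thread worker = new Thread(() =>
    {
        try
        {
            StartProcessing(item);
        }
        finally
        {
            // Guarantee the slot is freed before this thread exits
            ThreadsAvailable.Release();
        }
    });
    worker.Start();
}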