What is the best way to implement a queue of work for threads, so that I only ever have a maximum number of threads running, and if that many are already running, the code waits for a free slot before continuing?
Pseudo-code-ish example of what I mean; I'm sure this can be done in a better way...
(Please check the additional requirements below.)
private int _MaxThreads = 10;
private int _CurrentThreads = 0;

public void main(string[] args)
{
    List<object> listWithLotsOfItems = FillWithManyThings();
    while (listWithLotsOfItems.Count > 0)
    {
        // Get the next item that needs to be worked on.
        var item = listWithLotsOfItems[0];
        listWithLotsOfItems.RemoveAt(0);

        // IMPORTANT: more items can be added as we go.
        listWithLotsOfItems.AddRange(AddMoreItemsToBeProcessed());

        // Wait for a free thread slot.
        while (_CurrentThreads >= _MaxThreads)
            Thread.Sleep(100);

        Interlocked.Increment(ref _CurrentThreads); // risk of letting more than one thread through here...

        Thread t = new Thread(new ParameterizedThreadStart(WorkerThread));
        t.Start(item);
    }
}

public void WorkerThread(object bigheavyObject)
{
    // Do the heavy work here.
    Interlocked.Decrement(ref _CurrentThreads);
}
I looked at Semaphore, but that seems to need to run inside the thread, not outside before it is created. In the example below, the semaphore is used inside the thread, after it has been created, to halt it; in my case there could be over 100k threads that need to run before the job is done, so I would rather not create a thread before a slot is available.
(link to semaphore example)
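For what it's worth, here is roughly the gating I was picturing, with the wait happening in the main loop *before* the thread is created, using SemaphoreSlim (available in .NET 4.0). This is only a sketch; `WorkOnItem` is a placeholder for the real work:

```csharp
using System;
using System.Threading;

class SemaphoreGateSketch
{
    // Allows at most 10 threads to run at once.
    static readonly SemaphoreSlim _slots = new SemaphoreSlim(10);

    static void Process(object item)
    {
        _slots.Wait(); // blocks here, before any thread exists, until a slot is free

        Thread t = new Thread(o =>
        {
            try
            {
                WorkOnItem(o); // placeholder for the heavy work
            }
            finally
            {
                _slots.Release(); // free the slot even if the work throws
            }
        });
        t.Start(item);
    }

    static void WorkOnItem(object o)
    {
        // heavy work here
    }
}
```

Is something like this a reasonable direction, or is there a better pattern for it?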
In the real application, data can be added to the list of items as the program progresses, so Parallel.ForEach
won't really work either. (I'm doing this in a script component in an SSIS package, to send data to a very slow WCF service.)
SSIS is limited to .NET 4.0.
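The alternative I'm considering is a fixed pool of worker threads draining a BlockingCollection (also in .NET 4.0), since items can keep being added while workers run. Again just a sketch; `SendToWcf` and `FillWithManyThings` stand in for the real calls:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

class WorkerPoolSketch
{
    static void Main()
    {
        var queue = new BlockingCollection<object>();
        const int maxThreads = 10;
        var workers = new Thread[maxThreads];

        for (int i = 0; i < maxThreads; i++)
        {
            workers[i] = new Thread(() =>
            {
                // GetConsumingEnumerable blocks until an item is available
                // and ends once CompleteAdding has been called and the
                // collection is drained.
                foreach (var item in queue.GetConsumingEnumerable())
                    SendToWcf(item); // placeholder for the slow WCF call
            });
            workers[i].Start();
        }

        foreach (var item in FillWithManyThings())
            queue.Add(item); // items can keep being added as work progresses

        queue.CompleteAdding();
        foreach (var w in workers)
            w.Join();
    }

    static void SendToWcf(object item)
    {
        // slow WCF call here
    }

    static IEnumerable<object> FillWithManyThings()
    {
        yield break; // placeholder for the real item source
    }
}
```

Would that be the recommended approach here, given the .NET 4.0 constraint?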