I have a Windows service that processes jobs. A job in this service is a sequence of actions, such as:
A -> B -> C -> D -> E
Each job is supposed to be completely independent of the other jobs. The current implementation processes each job on its own thread (in this case, a new System.Threading.Thread), which works well most of the time. However, it turns out that the B action is not thread-safe, and two threads should not process the B action at the same time: if they do, strange results or errors sometimes occur. Note that the other actions may be executed concurrently by multiple jobs without any ill effects.
Rather than attempting to re-implement the B action in a thread-safe manner (which I anticipate would require a large amount of resources to implement and test), I would like to simply limit the B action to a single thread at a time.
Now the obvious initial solution would be to put a lock around the B action, which would certainly limit it to a single thread at a time. However, this would not guarantee that all waiting threads get a chance to process the B action, because threads are not guaranteed to acquire a contended lock in FIFO or even consistent order. I feel like this could theoretically lead to thread starvation, so I am not comfortable implementing this solution.
So my question is how can I implement a better (probably FIFO) job queue in .NET that guarantees that all jobs will make it through the B action, one by one?
My current thought is that I could maintain some kind of "manager" thread whose job is to pull a job from a queue and execute it when the B action is clear. (Perhaps I am describing implementing my own scheduler?) But that seems rather crude, and perhaps .NET (I'm using .NET 4.0) has better tools in its library for this scenario.
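To make the "manager" idea concrete, here is roughly what I have in mind, using BlockingCollection&lt;T&gt; backed by a ConcurrentQueue&lt;T&gt; (which is FIFO); the class and member names are just placeholders. Job threads enqueue their B work and block until the single dedicated worker has run it:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Placeholder sketch: one dedicated thread drains a FIFO queue of B-work
// items, so B executes one at a time, in arrival order. A job thread
// enqueues its B action and waits on an event until it has completed.
class BActionGate
{
    private readonly BlockingCollection<Tuple<Action, ManualResetEventSlim>> _queue =
        new BlockingCollection<Tuple<Action, ManualResetEventSlim>>(
            new ConcurrentQueue<Tuple<Action, ManualResetEventSlim>>());

    public BActionGate()
    {
        var worker = new Thread(() =>
        {
            foreach (var item in _queue.GetConsumingEnumerable())
            {
                item.Item1();     // run this job's B action
                item.Item2.Set(); // wake the waiting job thread
            }
        });
        worker.IsBackground = true;
        worker.Start();
    }

    // Called from a job thread between actions A and C; blocks until
    // this job's B action has run on the worker thread.
    public void RunB(Action bAction)
    {
        using (var done = new ManualResetEventSlim())
        {
            _queue.Add(Tuple.Create(bAction, done));
            done.Wait();
        }
    }
}
```

Is something along these lines reasonable, or is there a standard .NET 4.0 facility that already does this?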