
I have a C# thread pool class that is based heavily on the producer/consumer code from https://stackoverflow.com/a/1656662/782181. NOTE: I'm doing this instead of using BlockingCollection because I'm stuck with .NET 2.0!

I added a function to the class that can be called from the main thread to allow the main thread to do some work. My thinking here was that, at some point, the main thread has to wait for the work to be done anyway, so instead of just waiting, it could do some of the work itself to speed things up.

Here's a slimmed version of the class to demonstrate:

using System.Collections.Generic;
using System.Threading;

public static class SGThreadPool
{
    // Shared object to lock access to the queue between threads.
    private static object locker = new object();

    // The various threads that are doing our work.
    private static List<Thread> workers = null;

    // A queue of tasks to be completed by the workers.
    private static Queue<object> taskQueue = new Queue<object>();
    private static Queue<WaitCallback> taskCallbacks = new Queue<WaitCallback>();

    //OMITTED: Init function (starts the worker threads)

    // Enqueues a task for a thread to do.
    public static void EnqueueTask(WaitCallback callback, object context)
    {
        lock(locker)
        {
            taskQueue.Enqueue(context);
            taskCallbacks.Enqueue(callback);
            Monitor.PulseAll(locker); //Q: should I just use 'Pulse' here?
        }
    }

    // Can be called from main thread to have it "help out" with tasks.
    public static bool PerformTask()
    {
        WaitCallback taskCallback = null;
        object task = null;
        lock(locker)
        {
            if(taskQueue.Count > 0)
            {
                task = taskQueue.Dequeue();
            }
            if(taskCallbacks.Count > 0)
            {
                taskCallback = taskCallbacks.Dequeue();
            }
        }

        // No task means no work, return false.
        if(task == null || taskCallback == null) { return false; }

        // Do the work!
        taskCallback(task);
        return true;
    }

    private static void Consume()
    {
        while(true)
        {
            WaitCallback taskCallback = null;
            object task = null;
            lock(locker)
            {
                // While no tasks in the queue, wait.
                while(taskQueue.Count == 0)
                {
                    Monitor.Wait(locker);
                }

                // Get a task.
                task = taskQueue.Dequeue();
                taskCallback = taskCallbacks.Dequeue();
            }

            // Null task signals an exit.
            if(task == null || taskCallback == null) { return; }

            // Call consume callback with task as context.
            taskCallback(task);
        }
    }
}

Basically, I can enqueue a number of tasks to be performed by background threads. But it is also possible for the main thread to take a task and perform it by calling PerformTask().
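
To make the intended usage concrete, here's a hypothetical sketch of how the main thread might use it while waiting on a batch of work; DoWork, RunJobs, and workRemaining are illustrative names, not part of the actual class:

// Illustrative only: DoWork, RunJobs, and workRemaining are hypothetical.
private static int workRemaining;

private static void DoWork(object context)
{
    // ... process the task's context here ...
    Interlocked.Decrement(ref workRemaining);
}

public static void RunJobs(object[] jobs)
{
    workRemaining = jobs.Length;
    foreach(object job in jobs)
    {
        SGThreadPool.EnqueueTask(DoWork, job);
    }

    // Instead of idling, the main thread helps drain the queue.
    while(Thread.VolatileRead(ref workRemaining) > 0)
    {
        if(!SGThreadPool.PerformTask())
        {
            // Queue looked empty, but workers may still be running tasks.
            Thread.Sleep(0);
        }
    }
}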

I'm running into an infrequent problem where the main thread tries to take the lock in PerformTask(), but the lock is already held. The main thread waits, but for some reason the lock never becomes available.

Nothing in the code is jumping out at me as the cause of the deadlock - I'm hoping someone else might be able to spot the problem. I've been looking at this for a couple of hours, and I'm not sure why the main thread would get stuck at the lock() call in PerformTask(). It doesn't seem like any other thread could be holding the lock indefinitely. Is it a bad idea to allow the main thread to interact with the pool in this way?

  • Maybe you want to try `BlockingCollection` instead of queues + locks – L.B Sep 14 '16 at 21:56
  • @L.B, I actually think that would be great, but I'm working in Unity, which is stuck at .NET2.0 for the time being! I'll update the question to indicate this. – kromenak Sep 14 '16 at 22:03

1 Answer


Hmm, so, while I would still like to know why the code above could deadlock in certain scenarios, I think I've found a workaround that will do the trick.

If the main thread is going to be doing work here, I want to make sure the main thread doesn't get blocked for a long period of time. After all, a general dev rule: don't block the main thread!

So, the solution I'm trying is to use Monitor.TryEnter directly, rather than using lock() for the main thread. This allows me to specify a timeout for how long the main thread is willing to wait for the lock.

public static bool PerformTask()
{
    WaitCallback taskCallback = null;
    object task = null;

    // Use TryEnter, rather than "lock" because 
    // it allows us to specify a timeout as a failsafe.
    if(Monitor.TryEnter(locker, 500))
    {
        try 
        {
            // Pull a task from the queue.
            if(taskQueue.Count > 0)
            {
                task = taskQueue.Dequeue();
            }
            if(taskCallbacks.Count > 0)
            {
                taskCallback = taskCallbacks.Dequeue();
            }
        }
        finally
        {
            Monitor.Exit(locker);
        }
    }

    // No task means no work, return false.
    if(task == null || taskCallback == null) { return false; }

    // Do the work!
    taskCallback(task);
    return true;
}

In this code, the thread waits up to 500ms to acquire the lock. If it can't acquire it for whatever reason, it skips doing a task this time, but at least it doesn't get stuck. It seems like a good idea not to put the main thread in a position where it could wait indefinitely.

I believe that when you use lock(), the compiler generates similar code anyway, so I don't think there would be any performance issue with this solution.
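
For reference, a plain lock(locker) { ... } block compiles down to roughly the following (pre-.NET 4 compilers; newer ones use the Monitor.Enter(object, ref bool lockTaken) overload), which is why I don't expect TryEnter with a timeout to be noticeably slower:

Monitor.Enter(locker);
try
{
    // ... protected region ...
}
finally
{
    Monitor.Exit(locker);
}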

  • Your approach is correct, but one note on your dequeue code: I suggest asserting that the counts of the two queues are equal. It seems unlikely that they could differ, but it's worth checking your invariants. – VMAtm Sep 16 '16 at 18:00
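
For reference, here's a minimal sketch of the check VMAtm suggests, using System.Diagnostics.Debug.Assert inside the locked region that dequeues (the assertion message is just illustrative):

lock(locker)
{
    // Both queues are always enqueued and dequeued together, so their
    // counts should match; catch any drift early in debug builds.
    System.Diagnostics.Debug.Assert(taskQueue.Count == taskCallbacks.Count,
        "taskQueue and taskCallbacks are out of sync");

    if(taskQueue.Count > 0)
    {
        task = taskQueue.Dequeue();
        taskCallback = taskCallbacks.Dequeue();
    }
}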