
This isn't about the different methods I could or should be using to work with queues; rather, it's about something I have seen happening that makes no sense to me.

void Runner() {
    // member variable
    queue = Queue.Synchronized(new Queue());
    while (true) {
        if (0 < queue.Count) {
            queue.Dequeue();
        }
    }
}

This is run in a single thread:

var t = new Thread(Runner);
t.IsBackground = true;
t.Start();

Other events are "Enqueue"ing elsewhere. What I've seen happen is that, over a period of time, the Dequeue will actually throw an InvalidOperationException: queue empty. This should be impossible, since the count guarantees there is something there, and I'm positive that nothing else is "Dequeue"ing.

The question(s):

  1. Is it possible that the Enqueue actually increases the count before the item is fully on the queue (whatever that means...)?
  2. Is it possible that the thread is somehow restarting (expiring, reseting...) at the Dequeue statement, but immediately after it already removed an item?

Edit (clarification):

These code pieces are part of a Wrapper class that implements the background helper thread. The Dequeue here is the only Dequeue, and all Enqueue/Dequeue are on the Synchronized member variable (queue).

neouser99
  • Because of Ryan's answer ... is this the real code or just a simplified example? If it is the real code, you should really think about changing the loop - polling the queue instead of synchronizing the reader with the writers is a poor design. You are wasting millions of processor cycles to heat the room. – Daniel Brückner Apr 27 '09 at 16:59
  • This is an example to get to the meat of the problem. There is a Thread.Sleep in there, the processor is not being hammered. The reason I opted for a polling process instead of sync read/writers is due to the fact that the queue has something in it almost all the time. In our trunk though I've added an AutoResetEvent to play around with. Like I said at the top though, I'm not terribly concerned about implementation here. There seems to be a genuine problem with this threading model, be it right or wrong. – neouser99 Apr 27 '09 at 17:23
  • From looking at your code, you at least have the main thread and the thread where you think Dequeue is being called. Why not name your threads, and every time Dequeue is called, log the name of the thread with a stack trace. You might find that something in the main thread is behaving in a way you weren't expecting. – Brad Barker Apr 27 '09 at 19:11
  • You should be using a [BlockingCollection](http://msdn.microsoft.com/en-us/library/dd267312%28v=vs.110%29.aspx) for this, not a synchronized queue. – BlueRaja - Danny Pflughoeft Nov 12 '13 at 22:54

6 Answers


Using Reflector, you can see that no, the count does not get increased until after the item is added.

As Ben points out, it does seem as if you have multiple callers of Dequeue.

You say you are positive that nothing else is calling dequeue. Is that because you only have the one thread calling dequeue? Is dequeue called anywhere else at all?

EDIT:

I wrote a little sample code, but could not get the problem to reproduce. It just kept running and running without any exceptions.

How long was it running before you got errors? Maybe you can share a bit more of the code.

class Program
{
    static Queue q = Queue.Synchronized(new Queue());
    static volatile bool running = true; // volatile so the worker threads see the main thread's update

    static void Main()
    {
        Thread producer1 = new Thread(() =>
            {
                while (running)
                {
                    q.Enqueue(Guid.NewGuid());
                    Thread.Sleep(100);
                }
            });

        Thread producer2 = new Thread(() =>
        {
            while (running)
            {
                q.Enqueue(Guid.NewGuid());
                Thread.Sleep(25);
            }
        });

        Thread consumer = new Thread(() =>
            {
                while (running)
                {
                    if (q.Count > 0)
                    {
                        Guid g = (Guid)q.Dequeue();
                        Console.Write(g.ToString() + " ");
                    }
                    else
                    {
                        Console.Write(" . ");
                    }
                    Thread.Sleep(1);
                }
            });
        consumer.IsBackground = true;

        consumer.Start();
        producer1.Start();
        producer2.Start();

        Console.ReadLine();

        running = false;
    }
}
Erich Mirabal
  • Very nice, I actually have a test case that looks pretty close to this, and it gets the gist. The service was running for about 2+ weeks I think before it crashed. – neouser99 Apr 27 '09 at 20:03
  • At this point, all I can suggest is this: (1) You entered the If section, (2) A Solar Flare flipped your bit to zero, (3) Your queue threw an exception. It really should've thrown a SolarFlareException, but whatever. Potatoh, Potahto. – Erich Mirabal Apr 27 '09 at 20:20

Here is what I think the problematic sequence is:

  1. (0 < queue.Count) evaluates to true, the queue is not empty.
  2. This thread gets preempted and another thread runs.
  3. The other thread removes an item from the queue, emptying it.
  4. This thread resumes execution, but is now within the if block, and attempts to dequeue an empty list.

However, you say nothing else is dequeuing...

Try outputting the count inside the if block. If you see the count jump numbers downwards, someone else is dequeuing.
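As a hypothetical diagnostic sketch (not the original code, and assuming the question's `queue` member), you could capture the count seen by the if-check so a concurrent consumer shows up as the count dropping between the check and the call:

```csharp
// Diagnostic sketch: record the count observed by the if-check so that
// a second consumer shows up as the count dropping before Dequeue runs.
int before = queue.Count;
try
{
    queue.Dequeue();
}
catch (InvalidOperationException)
{
    // If 'before' was > 0 but Dequeue still threw, something else
    // dequeued in between; log the thread to help find the culprit.
    Console.WriteLine("Count was {0}, now {1}, on thread {2}",
        before, queue.Count, Thread.CurrentThread.ManagedThreadId);
    throw;
}
```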

Ben S
  • This was my first thought as well, but he stated "I'm positive that nothing else is "Dequeue"ing." – BFree Apr 27 '09 at 16:30
  • Kinda my thoughts, but I think he should clarify the facts that make him so certain that nothing else is calling Dequeue. – Erich Mirabal Apr 27 '09 at 16:37
  • +1. There's a race condition between Count and Dequeue regardless of whether any other thread is *currently* dequeuing. Best to fix the race condition and move on to the real issue. – Curt Nichols Mar 16 '10 at 19:21

Here's a possible answer from the MSDN page on this topic:

Enumerating through a collection is intrinsically not a thread-safe procedure. Even when a collection is synchronized, other threads can still modify the collection, which causes the enumerator to throw an exception. To guarantee thread safety during enumeration, you can either lock the collection during the entire enumeration or catch the exceptions resulting from changes made by other threads.

My guess is that you're correct - at some point, there's a race condition happening, and you end up dequeuing something that isn't there.

A Mutex or Monitor.Lock is probably appropriate here.
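A minimal sketch of that idea (assuming a plain `Queue` and a dedicated lock object; names are illustrative): take one lock around both the count check and the dequeue so the two can never be interleaved.

```csharp
// Sketch: check-then-dequeue as one atomic unit via lock (Monitor under the hood).
private static readonly object sync = new object();
private static readonly Queue queue = new Queue(); // plain Queue; the lock does the synchronizing

static bool TryDequeue(out object item)
{
    lock (sync)
    {
        if (queue.Count > 0)
        {
            // Safe: the count was checked under the same lock,
            // so no other thread can empty the queue in between.
            item = queue.Dequeue();
            return true;
        }
    }
    item = null;
    return false;
}
```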

Good luck!

Mike

Are the other areas that are "Enqueuing" data also using the same synchronized queue object? In order for the Queue.Synchronized to be thread-safe, all Enqueue and Dequeue operations must use the same synchronized queue object.

From MSDN:

To guarantee the thread safety of the Queue, all operations must be done through this wrapper only.

Edited: If you are looping over many items that involve heavy computation or if you are using a long-term thread loop (communications, etc.), you should consider having a wait function such as System.Threading.Thread.Sleep, System.Threading.WaitHandle.WaitOne, System.Threading.WaitHandle.WaitAll, or System.Threading.WaitHandle.WaitAny in the loop, otherwise it might kill system performance.
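As an illustration of the wait-function approach (a sketch under assumed names, not the questioner's code): producers signal an AutoResetEvent after each Enqueue, and a single consumer sleeps on it instead of polling.

```csharp
// Sketch: replace the polling loop with an AutoResetEvent the producers signal.
static readonly Queue queue = Queue.Synchronized(new Queue());
static readonly AutoResetEvent itemAdded = new AutoResetEvent(false);

static void Produce(object item)
{
    queue.Enqueue(item);
    itemAdded.Set(); // wake the consumer; no busy-waiting
}

static void Consume()
{
    while (true)
    {
        itemAdded.WaitOne(); // sleeps until a producer signals
        while (queue.Count > 0)
        {
            queue.Dequeue(); // drain whatever arrived
        }
    }
}
```

Note this drain pattern is only safe with a single consumer thread; with multiple consumers the Count/Dequeue race from the question reappears.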

Ryan
  • "[...] make sure that you have System.Threading.Thread.Sleep(1) in any thread loop [...]"!?! What? Why should you do something like that? – Daniel Brückner Apr 27 '09 at 16:43
  • Ahhh ... got it. You mean while(true) { }. And no - you really should not add Thread.Sleep() to this loop. You should not add Thread.Sleep() to any loop. This is just very poor design and should be fixed - the reader thread should synchronize with the writer thread instead of polling the queue. – Daniel Brückner Apr 27 '09 at 16:54
  • Sleep allows the system to share CPU and resources; otherwise the thread will consume significant resources (in some cases starving other threads and processes). Clearly this is sample code, but there is a while(true) with no exit condition, so I mentioned this. Sleep(0) has issues with not relinquishing control to lower-priority threads, so Sleep(1) is recommended. If the thread loop is timed or has a limited time span, then it may not be necessary to Sleep. I've seen a number of Windows services where a single Sleep statement changes CPU usage from near 80-100% to < 1%. – Ryan Apr 27 '09 at 17:11
  • I don't agree with "You should not add Thread.Sleep() to any loop." There are definite reasons: mainly Services using threads that run indefinitely (until stopped). As you mention there are other design patterns that could be used, and of course there are other ways to Sleep as well (see Richter concurrent affairs articles). But there are times and reasons when to use Sleep, but typically only for long running threads with scan cycles/loops and then use Sleep only once in the outermost thread loop. – Ryan Apr 27 '09 at 17:29
  • If Thread.Sleep(1) significantly reduces the processor usages, your loop does nothing most of the time. So you should use some form synchronization to have the thread only running if there is some work. Even if you bring the processor usage down to one percent, you are still wasting thousands of cycles - you have a simple if statement (maybe 10 cycles for a simple check) and two context switches (between 2000 and 8000 cyles each) per iteration. Or see it from the other side - 100 threads doing absolutly nothing will consume all of your processor time. – Daniel Brückner Apr 27 '09 at 18:55
  • If Thread.Sleep(1) does not reduce the processor usages there is a lot of work to be done and no need for Thread.Sleep(1). So I am quite sure there is not a single scenario justifying the use of Thread.Sleep() besides simulating high work load in test. Of course it is quick and easy to write code with Thread.Sleep(), but it is poor code in all cases. Counterexamples welcome. – Daniel Brückner Apr 27 '09 at 18:59
  • Exactly: many threads are actually waiting for events to happen. In this example, the thread is waiting for the queue to have objects added to it and most of the time it does nothing. This is the exact scenario I am referring to. Other examples include services/threads waiting for network, serial, or USB communications where most of the time the service is waiting for activity. If you do not use Sleep, then the thread will consume far more resources than needed when there is no activity. Using Sleep does not make code any quicker or easier to write; it is just using resources better. – Ryan Apr 27 '09 at 19:40
  • It was not my intention to hijack this question, but there are valid uses of Sleep. And by "Sleep" I mean any of the wait-able functions including Sleep, WaitOne, WaitAny, and others. neouser99 has that in the actual code, so this discussion isn't contributing to solving the problem. – Ryan Apr 28 '09 at 01:06
  • I would agree there are plenty of uses for Sleep, but in every case there is a 'better' solution - it just might not be anywhere near as practical. For example a thread that watches a directory periodically for new files. Ideally you would use FileSystemWatcher events or similar to detect the file on creation, but that wasn't always available. However listing the directory as often as the thread would run is also not practical, so a sensible period of sleep, based on how often you think the directory would have a new file added and how long is acceptable before you notice it, can be defined. – deed02392 Nov 08 '12 at 10:35
  • @deed02392 wow, you really dug up an old comment. In my experience, many of the engineering applications and embedded products that I've developed, a communications thread has been very important. Typically this is either serial or network data, and it runs constantly in a loop. I used the equivalent of WaitAny([serialEvent, abortEvent]) to detect incoming communications or if the abort is signaled. On older machines, this made a huge difference, but not so much on modern machines. I still think it might have a difference on a modern machine under heavy load as well. – Ryan Mar 07 '13 at 07:05
  • In one communications application, adding a 10 msec wait reduced the CPU usage from 90% down to 15%. Adding a wait function isn't necessary for many applications, but there are definitely scenarios where it makes a huge difference and in some cases it is the 'best' solution. Normally my usage was waiting for an event to occur (incoming communications) or for abort to be signaled. Your example (FileSystemWatcher) still uses an event internally along with IOCP and wait functions. However it is usually better to use existing classes and functionality, rather than building you own. – Ryan Mar 07 '13 at 07:22

question 1: If you're using a synchronized queue, then no, you're safe! But you'll need to use the synchronized instance on both sides, the producer and the consumer.

question 2: Terminating your worker thread when there is no work to do is a simple job. However, you would either way need a monitoring thread, or have the queue start a background worker thread whenever it has something to do. The latter sounds more like the Active Object pattern than a simple queue (whose single responsibility, per the Single Responsibility Principle, should be queueing).

In addition, I'd go for a blocking queue instead of your code above. The way your code works requires CPU processing power even if there is no work to do. A blocking queue lets your worker thread sleep whenever there is nothing to do. You can have multiple sleeping threads running without using CPU processing power.

C# doesn't come with a blocking queue implementation, but there are many out there. See this example and this one.
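For illustration, the core of a blocking queue can be sketched with Monitor.Wait/Pulse (this is an assumption-laden sketch, not one of the linked implementations):

```csharp
// Sketch: a minimal blocking queue. Dequeue sleeps (no CPU use) until
// Enqueue pulses the monitor to wake a waiting consumer.
class BlockingQueue
{
    private readonly Queue items = new Queue();

    public void Enqueue(object item)
    {
        lock (items)
        {
            items.Enqueue(item);
            Monitor.Pulse(items); // wake one waiting consumer
        }
    }

    public object Dequeue()
    {
        lock (items)
        {
            // Loop (not if): another consumer may win the race after the pulse.
            while (items.Count == 0)
                Monitor.Wait(items); // releases the lock and sleeps
            return items.Dequeue();
        }
    }
}
```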

tofi9

Another option for making thread-safe use of queues is the ConcurrentQueue<T> class, which has been available since .NET Framework 4 (introduced after this question was asked in 2009). This may help you avoid having to write your own synchronization code, or at least make it much simpler.

From .NET Framework 4.6 onward, ConcurrentQueue<T> also implements the interface IReadOnlyCollection<T>.
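With ConcurrentQueue<T>, the check-then-dequeue race from the question disappears, because TryDequeue checks for emptiness and removes the item as a single atomic operation (a minimal sketch):

```csharp
using System;
using System.Collections.Concurrent;

var queue = new ConcurrentQueue<int>();
queue.Enqueue(42);

// TryDequeue never throws for an empty queue; it simply returns false,
// so there is no window between a Count check and the removal.
if (queue.TryDequeue(out int item))
{
    Console.WriteLine(item); // prints 42
}
```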

Manfred