
I'm using a ConcurrentQueue to enqueue items from an I/O bound task and dequeue them from another for processing. I stop adding items to the queue when it reaches a certain size, so the processing can catch up. To do this I check the ConcurrentQueue.Count property.

The problem is that the Count property doesn't behave the way it does on a List or other collections. It's extremely slow, and the larger the queue, the longer it takes to read the Count property. With 20k items in a ConcurrentQueue, almost all processor time is spent reading the Count property.

Rough Example:

        while (reader.Read())
        {
            if (Queue.Count >= MaxQueueSize)
            {
                //Wait
            }
            //Do Stuff
        }

When running a performance profiler, all the time is spent in System.Collections.Concurrent.CDSCollectionETWBCLProvider.ctor().
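
For reference, the cost can be reproduced outside the profiler with a small sketch along these lines (the 20k queue size matches the figures above; the iteration count and class name are arbitrary):

    using System;
    using System.Collections.Concurrent;
    using System.Diagnostics;

    class CountBenchmark
    {
        static void Main()
        {
            var queue = new ConcurrentQueue<int>();

            // Pre-fill the queue to roughly the size described above.
            for (int i = 0; i < 20_000; i++)
                queue.Enqueue(i);

            const int iterations = 100_000;
            long checksum = 0;

            var sw = Stopwatch.StartNew();

            // Read Count in a tight loop, as the throttling check does.
            for (int i = 0; i < iterations; i++)
                checksum += queue.Count;

            sw.Stop();
            Console.WriteLine($"{iterations} reads of Count took {sw.ElapsedMilliseconds} ms ({checksum})");
        }
    }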

This only seems to occur on .NET Core 2; it does not occur on .NET 4.6.2.

Is there any way around this in .NET Core?

Douglas Gaskell
  • Hi Evk, thanks for the info. This is a non-issue on the .NET Framework. Only problematic on .NET core. Are you sure getting a `Count` requires visiting all items? – Douglas Gaskell May 17 '18 at 03:42
  • I'll check out `BlockingCollection`, thanks! That won't have serious impacts on performance when taking items in a FIFO fashion? – Douglas Gaskell May 17 '18 at 03:45
  • Update: Blocking collection worked great. If you add this as an answer I'll accept it. – Douglas Gaskell May 17 '18 at 03:54
  • Possible duplicate of [Thread safe limited size queue](https://stackoverflow.com/questions/32435318/thread-safe-limited-size-queue) or https://stackoverflow.com/questions/12410777/producer-consumer-pattern-with-a-fixed-size-fifo-queue – mjwills May 17 '18 at 03:58

1 Answer


Since the source code of both frameworks is available nowadays, one can take a peek at the Count implementation in each. For the full .NET Framework it's here, and for .NET Core it's here. You can see that:

  1. Both versions are non-trivial. It's not something like `return _count`.

  2. The .NET Core version is quite a bit more complicated, and it includes slow paths where the whole queue is frozen just to compute a stable count.

So it's not surprising that the .NET Core version is generally slower, but the main point is that both are non-trivial, and calling them in a tight loop is not a good idea. On top of that, checking Count like this is a race condition: another thread can enqueue an item immediately after your check, so the queue is not actually guaranteed to stay below the maximum size in your version.

Instead, use a built-in capacity-bounded collection, such as BlockingCollection:

    var x = new BlockingCollection<string>(new ConcurrentQueue<string>(), MaxQueueSize);
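
As a rough sketch of how this replaces the manual Count check (the MaxQueueSize value and the string payload below are placeholders for the real reader and records), the producer's Add call simply blocks while the collection is full:

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    class BoundedPipeline
    {
        private const int MaxQueueSize = 10_000; // illustrative bound

        static void Main()
        {
            // Bounded: Add blocks once MaxQueueSize items are waiting,
            // so the producer slows down until the consumer catches up.
            var items = new BlockingCollection<string>(
                new ConcurrentQueue<string>(), MaxQueueSize);

            var consumer = Task.Run(() =>
            {
                // Blocks until items arrive; ends after CompleteAdding() is called.
                foreach (var item in items.GetConsumingEnumerable())
                {
                    // Do Stuff (process the item)
                }
            });

            // Producer: stands in for the reader.Read() loop.
            for (int i = 0; i < 100_000; i++)
            {
                items.Add("record " + i); // blocks while the collection is full
            }

            items.CompleteAdding(); // signal that no more items are coming
            consumer.Wait();
        }
    }

Because the underlying collection is still a ConcurrentQueue, items are consumed in FIFO order.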
Evk