
I see these implementations of BlockingQueue and can't understand the differences between them. My conclusions so far:

  1. I won't ever need SynchronousQueue
  2. LinkedBlockingQueue ensures FIFO; BlockingQueue must be created with the parameter true to make it FIFO
  3. SynchronousQueue breaks most collection methods (contains, size, etc.)
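
Point 3 is easy to verify: since a SynchronousQueue never stores elements, the Collection-style methods always report an empty queue. A minimal check (class name is illustrative):

```java
import java.util.concurrent.SynchronousQueue;

public class SynchronousQueueInspection {
    public static void main(String[] args) {
        SynchronousQueue<Integer> q = new SynchronousQueue<Integer>();
        // A SynchronousQueue has no internal capacity, so the
        // Collection-view methods always report an empty queue:
        System.out.println(q.size());       // always 0
        System.out.println(q.isEmpty());    // always true
        System.out.println(q.contains(42)); // always false
        System.out.println(q.peek());       // always null
    }
}
```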

So when would I ever need SynchronousQueue? Is the performance of this implementation better than that of LinkedBlockingQueue?

To make it more complicated... why does Executors.newCachedThreadPool use SynchronousQueue when the others (Executors.newSingleThreadExecutor and Executors.newFixedThreadPool) use LinkedBlockingQueue?

EDIT

The first question is solved. But I still don't understand why Executors.newCachedThreadPool uses SynchronousQueue when the others (Executors.newSingleThreadExecutor and Executors.newFixedThreadPool) use LinkedBlockingQueue.

What I get is that with SynchronousQueue, the producer will be blocked if there is no free thread. But since the number of threads is practically unlimited (new threads will be created if needed), this will never happen. So why should it use SynchronousQueue?

nanda
  • possible duplicate of [Java Queue implementations, which one ?](http://stackoverflow.com/questions/1301691/java-queue-implementations-which-one) – aioobe Feb 24 '11 at 09:18
  • here's a post on different queuing strategies: http://letslearnjavaj2ee.blogspot.ru/2013/09/queuing-strategies-for.html – Valya May 17 '17 at 09:14

3 Answers


SynchronousQueue is a very special kind of queue - it implements a rendezvous approach (producer waits until consumer is ready, consumer waits until producer is ready) behind the interface of Queue.
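
A minimal sketch of that rendezvous behavior: put() blocks until another thread is ready to take(), and the element is handed off directly (class name is illustrative):

```java
import java.util.concurrent.SynchronousQueue;

public class RendezvousDemo {
    public static void main(String[] args) throws InterruptedException {
        final SynchronousQueue<String> q = new SynchronousQueue<String>();

        Thread consumer = new Thread(new Runnable() {
            public void run() {
                try {
                    // take() blocks until a producer arrives with an element
                    System.out.println("got: " + q.take());
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        consumer.start();

        // put() blocks until the consumer is ready to take();
        // the element is handed off directly and never stored
        q.put("hello");
        consumer.join();
    }
}
```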

Therefore you need it only in the special cases where you want exactly that hand-off semantics, for example, single-threading a task without queuing further requests.

Another reason for using SynchronousQueue is performance. The implementation of SynchronousQueue seems to be heavily optimized, so if you don't need anything more than a rendezvous point (as in the case of Executors.newCachedThreadPool(), where consumers are created "on demand", so that queue items don't accumulate), you can get a performance gain by using SynchronousQueue.

A simple synthetic test shows that in a single-producer, single-consumer scenario on a dual-core machine, the throughput of SynchronousQueue is ~20 times higher than the throughput of LinkedBlockingQueue and ArrayBlockingQueue with queue length = 1. As the queue length increases, their throughput rises and almost reaches that of SynchronousQueue. This means that SynchronousQueue has low synchronization overhead on multi-core machines compared to the other queues. But again, it matters only in the specific circumstances where you need a rendezvous point disguised as a Queue.

EDIT:

Here is a test:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.SynchronousQueue;

public class Test {
    static ExecutorService e = Executors.newFixedThreadPool(2);
    static int N = 1000000;

    public static void main(String[] args) throws Exception {
        for (int i = 0; i < 10; i++) {
            int length = (i == 0) ? 1 : i * 5;
            System.out.print(length + "\t");
            System.out.print(doTest(new LinkedBlockingQueue<Integer>(length), N) + "\t");
            System.out.print(doTest(new ArrayBlockingQueue<Integer>(length), N) + "\t");
            System.out.print(doTest(new SynchronousQueue<Integer>(), N));
            System.out.println();
        }

        e.shutdown();
    }

    private static long doTest(final BlockingQueue<Integer> q, final int n) throws Exception {
        long t = System.nanoTime();

        // Producer: puts n items into the queue
        e.submit(new Runnable() {
            public void run() {
                for (int i = 0; i < n; i++)
                    try { q.put(i); } catch (InterruptedException ex) {}
            }
        });

        // Consumer: takes n items; get() blocks until it finishes
        // (the result is unused, it only synchronizes the timing)
        Long r = e.submit(new Callable<Long>() {
            public Long call() {
                long sum = 0;
                for (int i = 0; i < n; i++)
                    try { sum += q.take(); } catch (InterruptedException ex) {}
                return sum;
            }
        }).get();
        t = System.nanoTime() - t;

        return (long) (1000000000.0 * n / t); // throughput, items/sec
    }
}

And here is a result on my machine:

[chart: throughput (items/sec) of each queue plotted against queue length]

axtavt
  • How did you manage to get that graph? I'd really like to see the performance via such a graph. – Sukh Jun 08 '20 at 17:03

Currently the default Executors (ThreadPoolExecutor based) can either use a set of pre-created threads of a fixed size and a BlockingQueue of some size for any overflow, or create threads up to a max size if (and only if) that queue is full.

This leads to some surprising properties. For instance, as additional threads are only created once the queue's capacity is reached, using a LinkedBlockingQueue (which is unbounded) means that new threads will never get created, even if the current pool size is zero. If you use an ArrayBlockingQueue then the new threads are created only if it is full, and there is a reasonable likelihood that subsequent jobs will be rejected if the pool hasn't cleared space by then.
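
The bounded-queue case can be sketched like this (pool sizes, queue capacity, and timings are illustrative; the rejection depends on the earlier tasks still occupying the pool when the last one is submitted):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolGrowthDemo {
    public static void main(String[] args) {
        // 1 core thread, at most 2 threads, queue capacity 1
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 60L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<Runnable>(1));

        Runnable sleeper = new Runnable() {
            public void run() {
                try { Thread.sleep(1000); } catch (InterruptedException e) { }
            }
        };

        pool.execute(sleeper); // 1st: starts the core thread
        pool.execute(sleeper); // 2nd: queued (queue has room)
        pool.execute(sleeper); // 3rd: queue full -> second thread created
        try {
            pool.execute(sleeper); // 4th: queue full, max threads reached
        } catch (RejectedExecutionException e) {
            System.out.println("rejected: pool and queue are both full");
        }
        pool.shutdown();
    }
}
```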

A SynchronousQueue has zero capacity so a producer blocks until a consumer is available, or a thread is created. This means that despite the impressive looking figures produced by @axtavt a cached thread pool generally has the worst performance from the producer's point of view.

Unfortunately there isn't currently a nice library version of a compromise implementation that will create threads during bursts of activity up to some maximum from a low minimum. You either have a growable pool or a fixed one. We have one internally, but it isn't ready for public consumption yet.

Jed Wesley-Smith
  • Well, if what you are saying is correct, then it's weird. I use a cached thread pool with LinkedBlockingQueue and it works. How do you explain that? – nanda Feb 25 '11 at 08:19
  • "This means that despite the impressive looking figures produced by @axtavt a cached thread pool generally has the worst performance from the producer's point of view." This is exactly what I commented on in @Peter's answer – nanda Feb 25 '11 at 08:49
  • What do you mean you use cached thread pool with LinkedBlockingQueue? They all use the same ThreadPoolExecutor under the hood, and it creates threads if and only if (a) there aren't corePoolSize threads already running, (b) there isn't a thread available to process a job, (c) the job is rejected when offered to the queue, and (d) the maximumPoolSize hasn't been reached. Otherwise it rejects. So, you will get thread creation beyond the core size only if you construct with a LinkedBlockingQueue whose max-capacity is significantly less than Integer.MAX_VALUE. It will only grow when the queue is full. – Jed Wesley-Smith Feb 28 '11 at 05:07
  • "blocks until a consumer is available" is incorrect. If we've reached the maxPoolSize, and all threads are busy, when a new task is submitted, behavior will depend on the RejectedExecutionHandler. @jed-wesley-smith correct me if I'm wrong. – Denny Abraham Cheriyan Jun 21 '19 at 19:05
  • @DennyAbrahamCheriyan I was describing the SynchronousQueue class specifically, which blocks when using the offer method with a timeout. That, however, is not how the ThreadPoolExecutor class uses it: it uses the immediate offer() and, when that fails, attempts to add another worker thread until max workers is reached, where it does indeed reject as you mention. The default Executors.newCachedThreadPool() factory method sets max to Integer.MAX_VALUE so it will never be reached. – Jed Wesley-Smith Jan 08 '20 at 02:39

The cached thread pool creates threads on demand. It needs a queue which either passes the task to a waiting consumer or fails. If there is no waiting consumer, it creates a new thread. SynchronousQueue does not hold an element; instead, it passes on the element or fails.
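
That hand-off-or-fail behavior is visible directly: a non-blocking offer() on a SynchronousQueue fails immediately unless a consumer is already waiting, while a LinkedBlockingQueue happily stores the element (a small sketch; class name is illustrative):

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.SynchronousQueue;

public class OfferDemo {
    public static void main(String[] args) {
        // No consumer is waiting, so the hand-off cannot happen:
        System.out.println(new SynchronousQueue<Integer>().offer(1));    // false
        // A LinkedBlockingQueue stores the element instead:
        System.out.println(new LinkedBlockingQueue<Integer>().offer(1)); // true
    }
}
```

When that offer() fails, ThreadPoolExecutor treats it as "no idle worker" and starts a new thread (up to maximumPoolSize).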

Peter Lawrey
  • Can you make this sentence clearer: 'SynchronousQueue supports this by passing a task to a wait consumer or failing if none so the pool can create another thread. i.e. it never holds a task.' I still don't understand that. Thanks – nanda Feb 24 '11 at 09:30
  • So what is the difference between: return new ThreadPoolExecutor(0, Integer.MAX_VALUE, 60L, TimeUnit.SECONDS, new SynchronousQueue()) and return new ThreadPoolExecutor(0, Integer.MAX_VALUE, 60L, TimeUnit.SECONDS, new LinkedBlockingQueue())? The first one is the implementation of newCachedThreadPool. So far I understand only that the 1st one blocks the producer if there is no free thread, but since the number of threads is 'unlimited', a new Thread will be created automatically. – nanda Feb 24 '11 at 10:04
  • AFAIK the second example will only add a new thread when the queue is full (which is never). It is possible it won't create even the first thread as the minimum is 0. – Peter Lawrey Feb 24 '11 at 10:15
  • I've tried it and it works as cached thread pool that I expected. – nanda Feb 24 '11 at 10:20
  • In that case, it is not actually queuing anything ;) – Peter Lawrey Feb 24 '11 at 10:23
  • yes, so that is the question. Why should it use SynchronousQueue (which doesn't even ensure FIFO)? – nanda Feb 24 '11 at 10:28
  • How can you have FIFO when there is no in or out? Every task is passed to another thread almost immediately. Using the LinkedBlockingQueue doesn't guarantee that tasks will be started in order, only removed from the queue in order. If the queue is always empty, there is no order. – Peter Lawrey Feb 24 '11 at 10:35
  • @Peter: so back to the original question, why even use SynchronousQueue if the same behavior can be simulated using LinkedBlockingQueue? This makes the behavior of newCachedThreadPool differ from the others because the submit process waits until the Thread is created. Even if it is so fast, it will still matter in certain cases. – nanda Feb 24 '11 at 15:39
  • If there is a thread waiting SynchronousQueue is much faster as it is lighter. LinkedBlockingQueue, even ArrayBlockingQueue has to create objects to add/take elements to/from the queue. Why ArrayBlockingQueue does this is another question. – Peter Lawrey Feb 24 '11 at 16:02