
Whenever I write a REST endpoint in NetBeans, it displays a warning that the method can be converted to an asynchronous one.

For example, I create the following method:

@GET
@Path("/test")
public String hello() {
    return "Hello World!";
}

NetBeans then shows a warning, see below:

Convert to asynchronous

Clicking on the tooltip generates this code:

private final ExecutorService executorService = java.util.concurrent.Executors.newCachedThreadPool();

@GET
@Path(value = "/test")
public void hello(@Suspended final AsyncResponse asyncResponse) {
    executorService.submit(new Runnable() {
        @Override
        public void run() {
            asyncResponse.resume(doHello());
        }
    });
}

private String doHello() {
    return "Hello World!";
}
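
For reference, on Java 8 the generated anonymous Runnable can also be written as a lambda; this is just a more compact form of the same generated code, not a different behaviour:

@GET
@Path("/test")
public void hello(@Suspended final AsyncResponse asyncResponse) {
    // Equivalent to the generated code above, only using a lambda.
    executorService.submit(() -> asyncResponse.resume(doHello()));
}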

The same happens when I create a PUT or POST method. Since NetBeans shows this warning for every REST endpoint I implement, it suggests to me that writing synchronous endpoints is considered wrong or bad practice. So, should every REST endpoint be asynchronous? Why?

Jakob
  • what would happen if two users request simultaneous access to the same resource? – mr mcwolf Oct 08 '17 at 13:52
  • As far as I understood, the server has a number of threads available in its thread pool. Whenever a new request comes in, the server assigns one of its threads to this request and releases the thread to the pool after the request has been handled (no matter which resource is accessed). This also means that JAX-RS is thread safe by default. – Jakob Oct 08 '17 at 14:06
  • what's your netbeans version? – Ori Marko Oct 09 '17 at 05:15
  • My netbeans version is 8.1 – Jakob Oct 09 '17 at 06:33

2 Answers


Sync

  • The work is done in the same thread as the IO. All threads are in the same pool.
  • Accepting a request takes 100 ms; doing the work takes 900 ms; total 1 s.
  • If you want to be able to accept 100 requests / second you need 100 IO threads.
  • If you want to be able to fulfill 100 requests / second you need the same 100 IO threads.

Async

  • The work is done in a different thread than the IO. You have two different thread pools.
  • Accepting a request still takes 100 ms; total 100 ms.
  • If you want to be able to accept 100 req/s you only need 10 IO threads in your pool.
  • Work still takes 900 ms; total 900 ms.
  • If you want to be able to fulfill 100 requests / second you need 90 Worker threads.
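
(Both sets of numbers come from the same back-of-the-envelope formula: threads needed ≈ requests per second × seconds each thread is busy per request. Sync: 100 req/s × 1.0 s = 100 threads. Async IO: 100 req/s × 0.1 s = 10 threads; async workers: 100 req/s × 0.9 s = 90 threads.)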

The initial number of threads in both scenarios is the same, yes.
However, IO threads and Worker threads usually have different requirements:

  • Want your IO threads to be kept alive so requests can be handled quicker?
    • With sync / one pool you have to do that for all 100 threads; with async / several pools you can do that just for the 10 IO threads.
  • Have some tasks that take longer?
    • With sync you have to increase the IO pool size to keep up; with async you can still take 100 req/s with your 10 IO threads and either increase the size of the Worker thread pool, return 503 / 429 to signal the overload, or even create several Worker thread pools with different properties to better suit your workload; speaking of which...
  • Want to benefit from using different types of thread pools for different types of tasks?
    • In async mode you're free to create several pools with different configurations and use the most appropriate one for each task, while leaving your IO thread pool alone. In sync mode, with just one pool, you simply can't do that.

For a simple app it doesn't really matter whether you make your endpoints sync or async; but in the general case, with a decent number of requests per second and different tasks with different traits (processing time, need to spawn their own child threads, priority), making your endpoints asynchronous is the best way to have a highly responsive system while making efficient use of resources.
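
To make the "separate pools, signal overload" idea concrete, here is a minimal sketch; the pool sizes, the bounded queue and the resource/method names are made up for illustration, not something JAX-RS or NetBeans gives you out of the box:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.Suspended;
import javax.ws.rs.core.Response;

@Path("/work")
public class WorkResource {

    // Hypothetical bounded Worker pool: up to 90 threads and a small queue;
    // submissions are rejected (instead of piling up) once the queue is full.
    private static final ExecutorService workers = new ThreadPoolExecutor(
            90, 90, 60, TimeUnit.SECONDS,
            new ArrayBlockingQueue<>(100),
            new ThreadPoolExecutor.AbortPolicy());

    @GET
    public void doWork(@Suspended final AsyncResponse asyncResponse) {
        try {
            // The IO thread only submits the job and returns immediately.
            workers.submit(() -> asyncResponse.resume(expensiveWork()));
        } catch (RejectedExecutionException overloaded) {
            // Worker pool saturated: fail fast with 503 so the IO thread is freed right away.
            asyncResponse.resume(Response.status(503).build());
        }
    }

    private String expensiveWork() {
        return "done";
    }
}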

walen

There are two main dimensions to service performance, from the client's perspective. A service client cares about:

  1. Throughput: how many concurrent requests can your service handle?

  2. Latency: how long does a single request have to wait before it gets a response?

It's easy to be tempted to care about throughput alone, but at a certain point chasing throughput starts to hurt latency: multithreading isn't free once you reach a certain scale.

The NetBeans hint helps only with throughput; as you've deduced, it does nothing for latency - that's entirely on you, bud. If throughput isn't a concern for your application at this point, you can safely ignore the hint.

There's no requirement or even rule of thumb that all JAX-RS services be async; that's just a leap on the part of the NetBeans development team (a pretty confident leap too, considering this is one hint they didn't think we'd ever need to disable).

Jersey does provide some recommendations to work around the limitations of latency, but the fact remains that there's no magic to it: a server-side operation takes as long as it takes, and the client has no choice but to wait for it to complete, barring the implementation of some callback pattern.
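
As one concrete mitigation along those lines (a minimal sketch, reusing the executorService from the question's generated code and a hypothetical slowOperation()), JAX-RS at least lets you cap how long the client waits by putting a timeout on the AsyncResponse:

@GET
@Path("/slow")
public void slow(@Suspended final AsyncResponse asyncResponse) {
    // If the work hasn't finished within 5 seconds, answer 503 instead of keeping the client waiting.
    asyncResponse.setTimeout(5, TimeUnit.SECONDS);
    asyncResponse.setTimeoutHandler(response -> response.resume(Response.status(503).build()));
    executorService.submit(() -> asyncResponse.resume(slowOperation()));
}

Note that this only bounds how long the client waits for a response; it doesn't make the server-side work itself any faster.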

kolossus