
Traditionally, if a Java web server (Jetty, ...) wants to serve multiple requests, it has to open a thread for each request, and if you have a large number of requests, you have to put the threads into a pool to keep the server from collapsing. The disadvantage of this approach is that the server consumes a lot of resources and clients have to wait in the thread pool's queue.
Some newer versions of Jetty support asynchronous servlets, which allow the server to save the information of each request without keeping a thread waiting for the event that triggers sending the response back to the client. You can find out more at this link.
I have some lines of code:

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import javax.servlet.AsyncContext;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class AsyncServlet extends HttpServlet implements OnEventCome {

    // Contexts of all suspended requests, waiting for the event
    List<AsyncContext> listContext = new ArrayList<>();

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Switch the request into asynchronous mode so doPost can return without responding
        AsyncContext ctxt = req.startAsync();
        ctxt.setTimeout(3600000L);
        synchronized (listContext) {
            listContext.add(ctxt);
        }
    }

    @Override
    public void onEventCome() {
        synchronized (listContext) {
            for (AsyncContext ctxt : listContext) {
                ctxt.start(new Runnable() {
                    @Override
                    public void run() {
                        // Complete the suspended request; the response is sent back to the client
                        ctxt.complete();
                    }
                });
            }
            listContext.clear();
        }
    }
}

The doPost method gets the client's request information and finishes immediately. The onEventCome callback waits for an event and then sends the responses to all clients.
My question is: how can the servlet keep all clients connected and waiting for a response without opening any threads?

Bui Minh Duc
  • Jetty uses a Thread Pool, not for (memory/heap) resource usage reasons, but for slow Thread Startup reasons. As for how much memory a thread uses, you might want to see the answer at https://stackoverflow.com/questions/36898701/how-does-java-jvm-allocate-stack-for-each-thread – Joakim Erdfelt Jun 12 '18 at 22:00

1 Answer


The asynchronous way still uses threads, but if the resources are not available it puts everything into a pool rather than rejecting the connection immediately.

The link you provided gives an answer in the last section; I have added emphasis:

Asynchronous Request handling

The Jetty Continuation (and the servlet 3.0 asynchronous) API introduce a change in the servlet API that allows a request to be dispatched multiple times to a servlet. If the servlet does not have the resources required on a dispatch, then the request is suspended (or put into asynchronous mode), so that the servlet may return from the dispatch without a response being sent. When the waited-for resources become available, the request is re-dispatched to the servlet, with a new thread, and a response is generated.
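To make the quoted suspend/re-dispatch cycle concrete, here is a minimal sketch of a servlet that suspends a request and dispatches it again once the waited-for resource becomes available. The ResourceService interface, the "resource" request attribute, and the URL mapping are hypothetical placeholders for illustration; this is not your OnEventCome design, just the pattern the documentation describes.

import java.io.IOException;
import java.util.function.Consumer;

import javax.servlet.AsyncContext;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical stand-in for whatever backend resource the servlet is waiting on
interface ResourceService {
    void whenReady(Consumer<String> callback);
}

@WebServlet(urlPatterns = "/resource", asyncSupported = true)
public class RedispatchServlet extends HttpServlet {

    // Placeholder implementation that "becomes ready" immediately
    private final ResourceService service = callback -> callback.accept("hello");

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        if (req.getAttribute("resource") == null) {
            // First dispatch: the resource is not ready, so suspend the request
            // and return without writing a response.
            AsyncContext ctxt = req.startAsync();
            service.whenReady(result -> {
                req.setAttribute("resource", result);
                ctxt.dispatch(); // re-dispatches the request on a new pooled thread
            });
        } else {
            // Second dispatch: the resource is now available, generate the response.
            resp.getWriter().println(req.getAttribute("resource"));
        }
    }
}

Note that the thread running doGet goes back to the pool as soon as the method returns after startAsync(); the second pass through doGet runs on a different pooled thread once dispatch() fires.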

sorifiend
  • It's the same as what we see every day in hotels: a hotel has many tables, many customers occupy those tables, and they simply place their orders with a limited number of waiters, so one waiter can serve more than one table, and if he is busy, some other waiter can serve that table's order. That's the main concept of the async way of handling connections. – Hakuna Matata Jun 12 '18 at 11:10
  • Tip: you can save even more held threads by using Async I/O (from Servlet 3.1+) as well. That way the thread is only allocated to a servlet when it wouldn't cause a blocking I/O operation. – Joakim Erdfelt Jun 12 '18 at 21:51
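As a follow-up to the async I/O tip in the comment above, here is a minimal sketch of the Servlet 3.1 WriteListener pattern. The URL mapping and the payload are made up for illustration, and the servlet is assumed to be deployed with asyncSupported = true.

import java.io.IOException;
import java.nio.charset.StandardCharsets;

import javax.servlet.AsyncContext;
import javax.servlet.ServletOutputStream;
import javax.servlet.WriteListener;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet(urlPatterns = "/async-io", asyncSupported = true)
public class AsyncIoServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        AsyncContext ctxt = req.startAsync();
        ServletOutputStream out = resp.getOutputStream();
        byte[] payload = "hello".getBytes(StandardCharsets.UTF_8);

        // The container calls onWritePossible() only when writing would not block,
        // so no thread is parked waiting on a slow client.
        out.setWriteListener(new WriteListener() {
            private boolean written = false;

            @Override
            public void onWritePossible() throws IOException {
                while (out.isReady()) {
                    if (written) {
                        ctxt.complete();
                        return;
                    }
                    out.write(payload);
                    written = true;
                }
            }

            @Override
            public void onError(Throwable t) {
                ctxt.complete();
            }
        });
    }
}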