
I need to perform some tasks (mostly calling multiple external URLs with request parameters and reading the data) concurrently in a Java servlet and send the response to the user within a few seconds. I am trying to use ExecutorService to achieve this. I need four FutureTasks created for each user request in the doGet method. Each task runs for around 5-10 seconds, and the total response time to the user is around 15 seconds.

Can you please suggest which of the following designs is better when using an ExecutorService in a Java servlet?

1) (Creating newFixedThreadPool per request and shutting it down ASAP)

public class MyTestServlet extends HttpServlet
{

    ExecutorService myThreadPool = null;

    public void init()
    {
          super.init();

    }
    protected void doGet(HttpServletRequest request,HttpServletResponse response)
    {

        // a new pool is created for every request and shut down at the end of doGet()
        myThreadPool = Executors.newFixedThreadPool(4);

        // the four Callables that call the external URLs are submitted here (task code omitted)
        taskOne   = myThreadPool.submit();
        taskTwo   = myThreadPool.submit();
        taskThree = myThreadPool.submit();
        taskFour  = myThreadPool.submit();

        ...
        ...

        // get() blocks until the corresponding task has completed
        taskOne.get();
        taskTwo.get();
        taskThree.get();
        taskFour.get();

        ...

        myThreadPool.shutdown();


    }

     public void destroy()
     {

         super.destroy();
     }

}

2) (Creating newFixedThreadPool during Servlet Init and shutting it down on servlet destroy)

public class MyTestServlet extends HttpServlet
{

    ExecutorService myThreadPool = null;

    public void init()
    {
      super.init();
          // What should the size of the fixed thread pool be so that it can handle multiple user requests without waiting?
          myThreadPool = Executors.newFixedThreadPool(20);

    }
    protected void doGet(HttpServletRequest request,HttpServletResponse response)
    {


        taskOne   = myThreadPool.submit();
        taskTwo   = myThreadPool.submit();        
        taskThree = myThreadPool.submit();
        taskFour  = myThreadPool.submit();

        ...
        ...

        taskOne.get();
        taskTwo.get();
        taskThree.get();
        taskFour.get();

        ...



    }

     public void destroy()
     {

          super.destroy();
          myThreadPool.shutdown();
     }

}

3) (Creating newCachedThreadPool during Servlet Init and shutting it down on servlet destroy)

public class MyTestServlet extends HttpServlet
{

      ExecutorService myThreadPool = null;

      public void init()
      {
        super.init();
            myThreadPool = Executors.newCachedThreadPool();

      }
      protected void doGet(HttpServletRequest request,HttpServletResponse response)
      {


          taskOne   = myThreadPool.submit();
          taskTwo   = myThreadPool.submit();        
          taskThree = myThreadPool.submit();
          taskFour  = myThreadPool.submit();

          ...
          ...

          taskOne.get();
          taskTwo.get();
          taskThree.get();
          taskFour.get();

          ...




     }

     public void destroy()
     {

            super.destroy();
            myThreadPool.shutdown();
      }

}
  • The container creates and loads a single instance of the servlet. And all the requests are processed by the same instance. So, `ExecutorService myThreadPool = null;` is not safe. – Bhesh Gurung Aug 02 '12 at 21:54
  • So can you please suggest how to declare the ExecutorService globally? – user1263019 Aug 03 '12 at 00:00

2 Answers


The first should not be an option. The idea of a thread pool (and probably of any pool) is to minimize the overhead and memory required for constructing the pool members (in this case, the worker threads). So, in general, pools should be initialized when your application starts and destroyed when it shuts down.

As for the choice between 2 and 3, please check the accepted answer in the following post. The answer explains the difference, and you can then decide which one suits your needs better: newcachedthreadpool-v-s-newfixedthreadpool
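For illustration, here is a minimal sketch of "initialize the pool at application start, shut it down at application stop" using a ServletContextListener. The class name AppThreadPoolListener and the "appThreadPool" attribute name are made up for this example; any servlet in the application could then fetch the shared pool from the ServletContext instead of creating its own.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class AppThreadPoolListener implements ServletContextListener
{
    public void contextInitialized(ServletContextEvent sce)
    {
        // create the pool once, when the application is deployed,
        // and share it through the ServletContext
        ExecutorService pool = Executors.newCachedThreadPool();
        sce.getServletContext().setAttribute("appThreadPool", pool);
    }

    public void contextDestroyed(ServletContextEvent sce)
    {
        // shut the pool down when the application is undeployed
        ExecutorService pool =
                (ExecutorService) sce.getServletContext().getAttribute("appThreadPool");
        if (pool != null)
        {
            pool.shutdown();
        }
    }
}

The listener still has to be registered, either in web.xml or with @WebListener on Servlet 3.0+ containers.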

Community
  • Thanks for the answer. Based on the link you suggested, newCachedThreadPool seems suitable, as my tasks are tied to an HTTP request and complete within a few seconds. – user1263019 Aug 03 '12 at 00:02

Creating and destroying a thread pool for each request is a bad idea: too expensive.

If you have some way to remember which HTTP request each URL-fetching task is related to, I'd go for a CachedThreadPool. Its ability to grow and shrink on demand will do wonders, because the URL-fetching tasks are totally independent and network-bound (as opposed to CPU- or memory-bound).

Also, I would wrap the ThreadPool in a CompletionService, which can notify you whenever a job is done, regardless of its submission order. First completed, first notified. This will ensure you don't block on a sloooow job if faster ones are already done.

CompletionService is easy to use: wrap it around an existing ThreadPool (newCachedThreadPool, for example), submit() jobs to it, and then take() the results back. Please note that the take() method is blocking.

http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/CompletionService.html
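As a rough sketch of that pattern (FetchUrlTask here is a made-up Callable<String> that would call one external URL and return the data it reads, the url variables are placeholders, myThreadPool is the shared application-wide pool, and exception handling is omitted as in the question's code):

// CompletionService and ExecutorCompletionService live in java.util.concurrent
CompletionService<String> completionService =
        new ExecutorCompletionService<String>(myThreadPool);

completionService.submit(new FetchUrlTask(urlOne));
completionService.submit(new FetchUrlTask(urlTwo));
completionService.submit(new FetchUrlTask(urlThree));
completionService.submit(new FetchUrlTask(urlFour));

// take() blocks until the next task completes, in completion order rather than submission order
for (int i = 0; i < 4; i++)
{
    String data = completionService.take().get();
    // use the data to build the response...
}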

Olivier Croisier
  • Thanks for the answer. Can you please elaborate on "If you have some way to remember which HTTP request each URL fetching task is related to"? Is there any chance of a mismatch between task and request if I use CachedThreadPool? Even though the thread pool is the same for each request, the task will be new for each request, and I can retrieve it using the .get() method once it completes, right? – user1263019 Aug 03 '12 at 00:08
  • Hummm I changed my mind - a completion service would require you to have one thread block on the take() method, and then redispatch the resulting Future to the originating request, which would be quite complicated. You'd better use invokeAll() on a standard ExecutorService: submit it a list of jobs, and it gives you back a list of results. Easier and more efficient. – Olivier Croisier Aug 03 '12 at 08:47
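For reference, a minimal sketch of the invokeAll() approach suggested in the last comment, reusing the same made-up FetchUrlTask Callable and placeholder url variables as above (exception handling omitted):

List<Callable<String>> tasks = new ArrayList<Callable<String>>();
tasks.add(new FetchUrlTask(urlOne));
tasks.add(new FetchUrlTask(urlTwo));
tasks.add(new FetchUrlTask(urlThree));
tasks.add(new FetchUrlTask(urlFour));

// invokeAll() blocks until every task has completed and returns the Futures
// in the same order as the task list, so each result can be matched to its URL
List<Future<String>> results = myThreadPool.invokeAll(tasks);
for (Future<String> result : results)
{
    String data = result.get();
    // use the data to build the response...
}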