
I am new to Play Framework and Scala in general. I'm trying to test and understand the differences between sync and async actions, using the following code:

package controllers

import play.api.Logger
import play.api.mvc._
import play.api.libs.concurrent.Execution.Implicits._
import scala.concurrent.Future

object Application extends Controller {

  def async = Action.async {
    Logger.info("async start")
    val resultF = Future {
      Thread.sleep(2000)
      Logger.info("async end")
      Ok
    }
    Logger.info("non-blocking")
    resultF
  }

  def sync = Action {
    Thread.sleep(2000)
    Ok
  }

}

When running the application, I open 10 tabs in the browser requesting "/async". My expectation was that all requests should take roughly 2 seconds to fulfill, and that I would see in the log 10 "async start" entries followed by 10 "async end" entries.

However, the actual outcome was "async start" immediately followed by "async end", repeated 10 times. The next request did not start until the previous request had finished. It seems the execution of async was blocking and could not handle concurrent requests at all.

My question is: why does the system behave this way, and what specific changes are needed to enable concurrent request handling?

2 Answers


Using Action.async doesn't automatically mean you're not blocking. It all depends on whether you're using a blocking API or not.

Thread.sleep is a blocking operation inside your Future, but you are not signaling to the ExecutionContext that you are doing so, so the behavior will vary depending on which ExecutionContext you use and how many processors your machine has. Your code works as expected with ExecutionContext.global.
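For example, here is a minimal sketch of the same action running its Future on Scala's global pool instead of Play's default context (the name asyncGlobal is just for illustration, and whether it keeps up under load still depends on how many cores, and therefore threads, that pool gets):

// Sketch only: the Future runs on scala.concurrent.ExecutionContext.global,
// whose default size equals the number of available processors.
// Use this import instead of play.api.libs.concurrent.Execution.Implicits._
// in the same scope, otherwise the implicit ExecutionContext is ambiguous.
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future

def asyncGlobal = Action.async {
  Logger.info("async start")
  val resultF = Future {
    Thread.sleep(2000) // still blocking, but spread across the global pool's threads
    Logger.info("async end")
    Ok
  }
  Logger.info("non-blocking")
  resultF
}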

In both actions you're calling Thread.sleep(2000), which blocks the thread, and in both cases that sleep runs on the action's default thread pool (which is not optimal).

As stated in Understanding Play thread pools:

Play framework is, from the bottom up, an asynchronous web framework. Streams are handled asynchronously using iteratees. Thread pools in Play are tuned to use fewer threads than in traditional web frameworks, since IO in play-core never blocks.

Because of this, if you plan to write blocking IO code, or code that could potentially do a lot of CPU intensive work, you need to know exactly which thread pool is bearing that workload, and you need to tune it accordingly.
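Concretely, the approach described there is to give blocking work its own thread pool. A rough sketch, assuming a custom dispatcher named contexts.blocking-io is defined in conf/application.conf (both that dispatcher name and the Contexts object are illustrative, not part of the original code):

// Sketch: run blocking work on a dedicated dispatcher instead of Play's default pool.
// Assumes a "contexts.blocking-io" dispatcher is configured in conf/application.conf.
import play.api.Play.current
import play.api.libs.concurrent.Akka
import scala.concurrent.{ExecutionContext, Future}

object Contexts {
  implicit val blockingIo: ExecutionContext =
    Akka.system.dispatchers.lookup("contexts.blocking-io")
}

def asyncDedicated = Action.async {
  Future {
    Thread.sleep(2000) // blocking call, but only this pool's threads are tied up
    Ok
  }(Contexts.blockingIo)
}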

In your case you're just waiting a couple of seconds in both actions, which blocks a thread from that pool, and the default setting for the pool's parallelism factor is 1 (one thread per core).

If you're going to block the thread, you can wrap the blocking call in scala.concurrent.blocking, like this:

def async = Action.async {
  Logger.info("async start")
  val resultF = Future {
    // blocking { } hints that the enclosed code will block the thread
    // (requires import scala.concurrent.blocking)
    blocking {
      Thread.sleep(2000)
      Logger.info("async end")
      Ok
    }
  }
  Logger.info("non-blocking")
  resultF
}
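Note that blocking is only a hint: it takes effect on pools whose threads cooperate with it, such as the fork-join pool behind scala.concurrent.ExecutionContext.global, which can spawn compensating threads. If the ExecutionContext in scope ignores the hint, the call is effectively a no-op, which is why a dedicated thread pool as sketched above is the more robust fix for blocking IO.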
– Atiq, thwiegan

The code works fine.

Use ab (ApacheBench) or another tool to send concurrent requests.

> ab -c 5 -n 5 localhost:9000/async
This is ApacheBench, Version 2.3 <$Revision: 1757674 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking localhost (be patient).....done


Server Software:        
Server Hostname:        localhost
Server Port:            9000

Document Path:          /async
Document Length:        0 bytes

Concurrency Level:      5
Time taken for tests:   2.013 seconds
Complete requests:      5
Failed requests:        0
Total transferred:      375 bytes
HTML transferred:       0 bytes
Requests per second:    2.48 [#/sec] (mean)
Time per request:       2013.217 [ms] (mean)
Time per request:       402.643 [ms] (mean, across all concurrent requests)
Transfer rate:          0.18 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       0
Processing:  2008 2011   2.0   2011    2013
Waiting:     2007 2011   2.0   2011    2013
Total:       2008 2011   2.0   2011    2013

Percentage of the requests served within a certain time (ms)
  50%   2011
  66%   2012
  75%   2012
  80%   2013
  90%   2013
  95%   2013
  98%   2013
  99%   2013
 100%   2013 (longest request)

I sent 5 concurrent requests and all completed as expected (see Time taken for tests: 2.013 seconds above).

– Andrzej Jozwik
  • Thanks for suggesting ApacheBench. I used your test and confirmed it was working for me. The problem, it seems, is actually the browser: when making (GET) requests to the same URL, Chrome locks the cache and blocks subsequent requests; see this question: [Chrome stalls when making multiple requests to same resource?](https://stackoverflow.com/questions/27513994/chrome-stalls-when-making-multiple-requests-to-same-resource) – Yiran Sheng Jun 22 '17 at 22:43