
I have a Web API project whose controllers use async/await actions and tasks.

I've noticed that every request after the 6th one gets queued. To test this, I made a simple delayed action:

    [HttpGet()]
    [Route("maestros/prueba")]
    public async Task<IHttpActionResult> prueba()
    {

        // Simulate a slow operation (5 seconds)
        await Task.Delay(5000);

        return Ok(true);
    }

Then I called it from an AngularJS app. I clicked it 18 times, and the results are shown below:

[Screenshot: 18 requests, processed in groups of 6, each taking about 5 s]

I was hoping they would all be processed at once, even if each took a small performance hit, but instead they are processed in blocks of 6 requests.

I have tried disabling session state at the controller level, but it doesn't help. I've looked everywhere for information, but nothing is conclusive and I don't really know what's happening.

For this project it's a real problem, because there are certain parts where more than 6 requests are issued, and everything slows down.

What is happening? Can I change it?

MWS
  • That Angular screenshot is most likely the browser limitation. Browsers can only do one http request at a time – zaitsman Sep 03 '18 at 13:12
  • Can you clarify, please? I don't understand how that's related to the fact that I get chunks of 6 processed requests. – MWS Sep 03 '18 at 13:35
  • @zaitsman actually I think they can make more than one. https://stackoverflow.com/a/30064610/5947043 gives a helpful list, and the screenshot above clearly demonstrates Chrome making 6 concurrent requests in each of the groups there (by which I mean a new "group" starts after the much larger jumps in start times), rather than 1, which backs up the stats in that answer (max connections-per-hostname in Chrome is listed as being 6). But you're right, it's a limitation of the browser rather than the server. – ADyson Sep 03 '18 at 13:36
  • From that screenshot you'll also clearly see request #7 starting just as request #1 ends. And #8 starting as #2 ends, etc etc. If you test your API using something like PostMan you'll be able to see that it can accept more requests concurrently. – ADyson Sep 03 '18 at 13:37
  • Are we really limited to 6 requests to a server in 2018? I already checked with Postman, but it sends requests sequentially, not concurrently, so my only option was the AngularJS app. – MWS Sep 03 '18 at 14:07
  • This has nothing to do with the server. It is totally "the browser only makes 6 concurrent requests". Why do you even ask about the server here? Browser limitation. Use multiple domains to bypass it. HTTP/2 can use multiple parallel requests, too, over one TCP connection. – TomTom Sep 03 '18 at 14:42
  • @MWS https://timbeynart.com/2017/03/14/use-postman-to-hammer-a-rest-api/ shows you how you can run some requests concurrently in PostMan. Alternatively, something like Apache jMeter can be used to do it. Anyway, it will only demonstrate what you've been told already - this is not the server's problem. – ADyson Sep 03 '18 at 14:51
  • @MWS I assume that browsers have such a limitation for a good reason - they all seem to have set a similar threshold. Maybe related to performance I guess, or to make browser-based DDOS attacks harder or something. To be honest a web app making more than that number of concurrent requests maybe needs a design rethink, although some apps which do do it get round the limits by using domain sharding (https://blog.stackpath.com/glossary/domain-sharding/). But I think you're more concerned about the capacity of your webserver? Maybe you need to research into specific load testing tools. – ADyson Sep 03 '18 at 14:52

2 Answers


You fail to see the obvious. It is NOT Web API limiting concurrent requests. It is your browser not sending more.

Max parallel http connections in a browser?

Basically, Chrome will only ever use 6 concurrent open requests per domain. Want more? Use multiple browsers or multiple domains.

This has absolutely nothing to do with Web API (server side) and is purely a client-side limitation that your browser of choice enforces.
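
A quick way to confirm this is to call the endpoint from a non-browser client. Below is a minimal console sketch, assuming the API is reachable at https://localhost:44300 (adjust the base address and route to your own hosting); it starts 18 requests at once with HttpClient:

    // Minimal sketch: fire 18 concurrent requests from a console app.
    // The base address is an assumption; point it at your own host/port.
    using System;
    using System.Diagnostics;
    using System.Linq;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    class ConcurrencyTest
    {
        static async Task Main()
        {
            // On .NET Framework the default per-host connection limit is 2,
            // so raise it here; .NET Core has no such low default.
            ServicePointManager.DefaultConnectionLimit = 100;

            using (var client = new HttpClient { BaseAddress = new Uri("https://localhost:44300/") })
            {
                var sw = Stopwatch.StartNew();

                // Start all 18 requests before awaiting any of them;
                // no per-domain browser limit applies here.
                var tasks = Enumerable.Range(0, 18)
                                      .Select(_ => client.GetAsync("maestros/prueba"))
                                      .ToArray();

                await Task.WhenAll(tasks);

                Console.WriteLine($"18 requests completed in {sw.Elapsed.TotalSeconds:F1}s");
            }
        }
    }

With the 5-second delay in your action, all 18 requests should complete in roughly 5 seconds rather than 15, showing that the server itself accepts them concurrently.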

TomTom

A better (and handy) tool to test this is Bombardier. It can fire as many requests as the server can handle.

All of our endpoints use async/await and easily serve more than the 6 concurrent requests the browser allows.
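
For reference, a run along these lines fires 100 requests over 20 concurrent connections against the endpoint from the question (the host/port is an assumption, and flags can vary by version, so check bombardier --help):

    bombardier -c 20 -n 100 https://localhost:44300/maestros/prueba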

JayRone