
My SPA web client has a page that makes ~30 requests simultaneously to the same (light) endpoint in my REST API (ASP.NET Core 6).

Currently, my configuration seems to handle only 5 concurrent requests at a time; the others are queued. The consequence is a very slow user experience and under-utilization of my server resources (mainly CPU).

I have not (as far as I understand it) configured anything explicitly. My guess is that 5 is the default setting.

The Host builder is using default values:

public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(webBuilder =>
            {
                webBuilder.UseStartup<Startup>();
                webBuilder.UseSentry();
            })
            .ConfigureLogging(logging =>
            {
                logging.ClearProviders();
                logging.SetMinimumLevel(LogLevel.Trace);
            })
            .UseNLog();  // Setup NLog for Dependency injection;

The web.config file does not have any specific config regarding the topic at hand:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
    <system.webServer>
        <security>
            <requestFiltering>
                <requestLimits maxAllowedContentLength="209715200" />
            </requestFiltering>
        </security>
    </system.webServer>
</configuration>

My PUT endpoint is fairly basic:

[HttpPut("{id}/Foo")]
public async Task<ResultDto> GetFoo(int id, QueryDto query, CancellationToken cancellationToken) { ... }

I found a lot of resources regarding legacy ASP.NET but nothing of value for ASP.NET Core 6 (even though this answer confuses me).

How can I configure ASP.NET Core to allow more concurrent requests to the same endpoint?

PS: I experience the same limit of 5 concurrent requests both on localhost and when deployed on Azure App Service (Linux).

Askolein
  • There is nothing by default 'limiting' the amount of requests your asp.net core (6) application can handle - the entire framework is designed to handle thousands of (concurrent) requests with very few threads. So this usually means that you have a bottleneck elsewhere in your application. Have you tried seeing if a basic endpoint returning a simple status code is also affected by this limit you're seeing? – nbokmans May 04 '22 at 09:46
  • The issue is likely to be with the implementation of your `GetFoo()` endpoint, which you've not provided the code for. The defaults for ASP.NET Core support a very high number of concurrent requests. Alternatively, the machine you're running the code on is of a limited specification or has poor networking. – Martin Costello May 04 '22 at 09:54
  • Could it be related to the CPU core count of your computer? – Bayram Eren May 04 '22 at 09:56

1 Answer


Thanks to the comments posted above by everyone, it appears this is not an API (ASP.NET Core) issue.

Browsers limit the number of concurrent open connections to a given server (typically around 6 per host over HTTP/1.1). There are ways to work around that behaviour.
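One commonly cited workaround (an assumption on my part, not something the original post confirms) is HTTP/2, which multiplexes many requests over a single connection, so the browser's per-host connection cap stops being the bottleneck. A minimal sketch of enabling it on Kestrel, assuming the default host builder from the question:

```csharp
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Server.Kestrel.Core;
using Microsoft.Extensions.Hosting;

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.ConfigureKestrel(options =>
            {
                // Offer HTTP/2 alongside HTTP/1.1 for all endpoints.
                // Note: browsers only negotiate HTTP/2 over TLS (ALPN),
                // so this matters for the HTTPS endpoint in practice.
                options.ConfigureEndpointDefaults(listenOptions =>
                {
                    listenOptions.Protocols = HttpProtocols.Http1AndHttp2;
                });
            });
            webBuilder.UseStartup<Startup>();
        });
```

On Azure App Service, HTTP/2 is a platform setting (the "HTTP version" option on the configuration blade) rather than something Kestrel alone controls, since the platform front-end terminates TLS.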

The solution is also to rewrite my endpoint so it returns results in batches instead of one-by-one, which lowers the number of actual requests the frontend has to make.
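A rough sketch of what that batched endpoint could look like, reusing the `ResultDto`/`QueryDto` names from the question. The route, the `BatchQueryDto` type, and the `GetFooCoreAsync` helper are all hypothetical, since the original `GetFoo` body was never shown:

```csharp
// One request carrying N ids replaces ~30 individual calls from the SPA.
[HttpPut("Foos")]
public async Task<List<ResultDto>> GetFoos(
    [FromBody] BatchQueryDto batch, CancellationToken cancellationToken)
{
    var results = new List<ResultDto>(batch.Ids.Count);
    foreach (var id in batch.Ids)
    {
        // Hypothetical helper wrapping the existing single-item logic.
        results.Add(await GetFooCoreAsync(id, batch.Query, cancellationToken));
    }
    return results;
}

// Hypothetical request DTO carrying all ids plus the shared query.
public class BatchQueryDto
{
    public List<int> Ids { get; set; } = new();
    public QueryDto Query { get; set; }
}
```

The items are processed sequentially here for simplicity; whether to fan out server-side (e.g. `Task.WhenAll`) depends on what the single-item logic touches (DbContext instances, for example, are not safe to share across concurrent tasks).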
