
I was originally trying to create an HTTP endpoint that would remain open for a long time (until a remote service executes and finishes, then return the result to the original caller), and I hit some concurrency issues: the endpoint would only execute a small number of times concurrently (around 10, whereas I'd expect hundreds if not more).

I then narrowed my code down to a test endpoint that merely returns after the number of milliseconds you give it via the URL. This method should, in theory, allow maximum concurrency, but it doesn't: not when running under IIS on a Windows 10 desktop PC, and not when running on Windows Server 2012.

This is the test Web API endpoint:

[Route("throughput/raw")]
[HttpGet]
public async Task<IHttpActionResult> TestThroughput(int delay = 0)
{
    await Task.Delay(delay);
    return Ok();
}

And this is a simple test app:

class Program
{
    static readonly HttpClient HttpClient = new HttpClient();
    static readonly ConcurrentBag<long> Stats = new ConcurrentBag<long>();
    private static Process _currentProcess;

    private static string url = "http://local.api/test/throughput/raw?delay=0";

    static void Main()
    {
        // Warm up
        var dummy = HttpClient.GetAsync(url).Result;
        Console.WriteLine("Warm up finished.");
        Thread.Sleep(500);

        // Get current process for later
        _currentProcess = Process.GetCurrentProcess();

        for (var i = 1; i <= 100; i++)
        {
            Thread t = new Thread(Proc);
            t.Start();
        }

        Console.ReadKey();
        Console.WriteLine($"Total requests: {Stats.Count}\r\nAverage time: {Stats.Average()}ms");
        Console.ReadKey();
    }

    static async void Proc()
    {
        Stopwatch sw = Stopwatch.StartNew();
        await HttpClient.GetAsync(url);
        sw.Stop();
        Stats.Add(sw.ElapsedMilliseconds);
        Console.WriteLine($"Thread finished at {sw.ElapsedMilliseconds}ms. Total threads running: {_currentProcess.Threads.Count}");
    }
}

The results I get are these:

Warm up finished.
Thread finished at 118ms. Total threads running: 32
Thread finished at 114ms. Total threads running: 32
Thread finished at 130ms. Total threads running: 32
Thread finished at 110ms. Total threads running: 32
Thread finished at 115ms. Total threads running: 32
Thread finished at 117ms. Total threads running: 32
Thread finished at 119ms. Total threads running: 32
Thread finished at 112ms. Total threads running: 32
Thread finished at 163ms. Total threads running: 32
Thread finished at 134ms. Total threads running: 32
...
...
Some more
...
...
Thread finished at 4511ms. Total threads running: 32
Thread finished at 4504ms. Total threads running: 32
Thread finished at 4500ms. Total threads running: 32
Thread finished at 4507ms. Total threads running: 32
Thread finished at 4504ms. Total threads running: 32
Thread finished at 4515ms. Total threads running: 32
Thread finished at 4502ms. Total threads running: 32
Thread finished at 4528ms. Total threads running: 32
Thread finished at 4538ms. Total threads running: 32
Thread finished at 4535ms. Total threads running: 32

So:

  1. I'm not sure why there are only 32 threads running (I assume it's related to the number of cores on my machine, although sometimes the number is 34, and in any case I'd expect many more).

  2. The main issue I'm trying to tackle: The running time goes up as more calls are created, whereas I'd expect it to remain relatively constant.

What am I missing here? I'd expect an ASP.NET site (a Web API in this case, but it doesn't matter), running on Windows Server (so no artificial concurrency limit applies), to handle all these concurrent requests just fine without the response time increasing. I believe the response time increases because threads are capped on the server side, so subsequent HTTP calls wait for their turn. I'd also expect more than 32-34 threads running in the client (test) application.

I also tried tweaking machine.config, without much success, but I think even the defaults should give much more throughput.

Ofer Zelig
  • I think that again it's the lock on the session; have you read this answer? http://stackoverflow.com/questions/11629600/does-asp-net-web-forms-prevent-a-double-click-submission/11629664#11629664 – Aristos Nov 02 '16 at 00:31
  • @Aristos, no, it's not a browser issue or the like. Pure HTTP client and server. – Ofer Zelig Nov 02 '16 at 03:09
  • Apparently you did not understand what I said in the link. Read it again; it has nothing to do with the browser. – Aristos Nov 02 '16 at 08:06
  • You were talking about a "session" all over the place. Which session? I believe this is not quite explained. – Ofer Zelig Nov 02 '16 at 11:42
  • The server side keeps a session for the user, and to keep the data consistent it locks all calls until they end. Don't you know the session module in ASP.NET? https://msdn.microsoft.com/en-us/library/ms178581.aspx – Aristos Nov 02 '16 at 11:54
  • Look, I am trying to help, and from what I see I am the only one who voted for you. I hope I understand what you are asking; what I am saying is that ASP.NET on the server side (not the client-side call) keeps a lock on the user's session, and this prevents high concurrency, unless you disable the session. If you disable the session for those calls on the server side, ASP.NET does not use any synchronization and all calls run in parallel. – Aristos Nov 02 '16 at 12:00
  • Aristos, you don't need to take offense. We're discussing a professional matter here; it's not Facebook. The session on the server side, if enabled, does not prevent concurrency. It might be a specific WebForms mechanism in place to prevent rapid clicks, and I don't know if that's the case (I'm very happy not to have been programming WebForms for 8 years and am not willing to even think of returning), but as a general concept the ASP.NET session does not block concurrency. My specific example was Web API. Actually @JohnWu was right: the limit was on the client side. But thanks anyway! – Ofer Zelig Nov 05 '16 at 15:21

2 Answers


HTTP Client

The number of simultaneous HttpClient connections is limited by your ServicePointManager. If you believe this article, the default is 2. TWO!! So your requests are getting queued. You can increase the number by setting the DefaultConnectionLimit.
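A minimal sketch of that setting (100 is an illustrative value; the limit must be set before the first request to the host is issued):

```csharp
using System;
using System.Net;

class ConnectionLimitDemo
{
    static void Main()
    {
        // Raise the per-host connection limit before any HttpClient
        // request goes out; the low default is what queues the
        // tester's requests behind each other.
        ServicePointManager.DefaultConnectionLimit = 100; // illustrative value

        Console.WriteLine(ServicePointManager.DefaultConnectionLimit); // prints 100
    }
}
```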

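The same limit can also be set declaratively in the client's app.config instead of in code. A sketch, where address="*" applies the limit to all hosts and 100 is again an illustrative value:

```xml
<configuration>
  <system.net>
    <connectionManagement>
      <add address="*" maxconnection="100" />
    </connectionManagement>
  </system.net>
</configuration>
```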
Threads

Edit by the OP: although factually true for thread pools, my question did not involve use of the thread pool. I'm leaving this here, though, for future reference (for usages slightly different from the one demonstrated in the question) and out of respect for the person who gave this answer.

There is a maximum number of threads in your default thread pool. The default is not preset; it depends on the amount of memory available and other factors, and is apparently 32 on your machine. See this article, which states:

Beginning with the .NET Framework 4, the default size of the thread pool for a process depends on several factors, such as the size of the virtual address space. A process can call the GetMaxThreads method to determine the number of threads.

You can, of course, change it.
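For completeness, the pool limits can be inspected and raised like this (a sketch; the actual numbers vary per machine and runtime version, and 100 is an illustrative minimum):

```csharp
using System;
using System.Threading;

class ThreadPoolLimitsDemo
{
    static void Main()
    {
        // Inspect the current pool limits the answer refers to.
        ThreadPool.GetMaxThreads(out int maxWorker, out int maxIo);
        ThreadPool.GetMinThreads(out int minWorker, out int minIo);
        Console.WriteLine($"Max: {maxWorker} worker / {maxIo} I/O");
        Console.WriteLine($"Min: {minWorker} worker / {minIo} I/O");

        // Raising the minimum lets a burst of work ramp up without the
        // pool's gradual thread-injection delay.
        ThreadPool.SetMinThreads(100, minIo);
    }
}
```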

John Wu
  • Your answer is partially correct - the first part is. It can actually be configured in your client's app.config as well (and I might edit your answer to demonstrate this). But the second part is not quite the case, as I'm not using the thread pool (such as when you call `ThreadPool.QueueUserWorkItem`). I'm explicitly creating threads. – Ofer Zelig Nov 02 '16 at 03:14
  • The first part of your answer John was the right thing - ticking it as correct. Thanks! – Ofer Zelig Nov 05 '16 at 15:15

John's answer addresses setting the default connection limit. Additionally, don't use blocking threads at all; that way you won't need to care about the size of the thread pool. Your tester is I/O-bound, not CPU-bound. Your Proc already returns immediately, so just call it without creating a new thread. Change its return type to Task so you can tell when its deferred portion is done.

Then Main will go something like this:

public static async Task Main() {
    await HttpClient.GetAsync(url);
    await Task.Delay(500); // Wait for warm up.
    await Task.WhenAll(Enumerable.Range(0, 100).Select(_ => Proc()));
    // Print results here.
}
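A self-contained sketch of that pattern, with Task.Delay standing in for the HttpClient call so it runs anywhere (the names mirror the question's code; requires C# 7.1+ for async Main):

```csharp
using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;

class AsyncTester
{
    static readonly ConcurrentBag<long> Stats = new ConcurrentBag<long>();

    // Task-returning version of the question's Proc; awaiting it tells
    // the caller when the deferred portion has completed.
    static async Task Proc()
    {
        var sw = Stopwatch.StartNew();
        await Task.Delay(10); // stands in for HttpClient.GetAsync(url)
        sw.Stop();
        Stats.Add(sw.ElapsedMilliseconds);
    }

    static async Task Main()
    {
        // 100 concurrent "requests" without any dedicated threads.
        await Task.WhenAll(Enumerable.Range(0, 100).Select(_ => Proc()));
        Console.WriteLine($"Total requests: {Stats.Count}\r\nAverage time: {Stats.Average()}ms");
    }
}
```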
Edward Brey