I've run into the following issue with web service request processing:

Preamble
I have

  • Web api service hosted on IIS 7.0 on local machine
  • Test harness console application on the same machine

and I'm trying to simulate load on the web service by hitting it with requests generated by the test harness app. The core harness code:

    static int HitsCount = 40;

    static async Task PerformHitting()
    {
        await Task.WhenAll(ParallelEnumerable.Range(0, HitsCount)
                                             .Select(_ => HitAsync())
                                             .WithDegreeOfParallelism(HitsCount));
    }

    static async Task HitAsync()
    {
        // some logging skipped here
        ...
        await new HttpClient().GetAsync(TargetUrl, HttpCompletionOption.ResponseHeadersRead);
    }

Expectation
Logging shows that all HitAsync() calls are made almost simultaneously: every hit via HttpClient started within the
[0s; 0.1s] time frame (timings here and below are roughly rounded). Hence, I expected to catch all these requests in approximately the same time frame on the web service side.

Reality
But logging on the service side shows that the requests arrive in bunches of 8-12 requests each, and the service receives these bunches at ~1 second intervals. I mean:

[0s, 0.3s] <- requests #0-#10
[1.2s, 1.6s] <- requests #10-#20
...
[4.1s, 4.5s] <- requests #30-#40

And I'm getting really long total execution times for any significant HitsCount value.

Question
I suspect some kind of built-in service throttling mechanism or a framework limit on concurrent connections. The only thing I found related to this guess is that, but I didn't have any success trying the solutions from there.
Any ideas what is the issue?

Thanks.

buhtopuhta
  • Building a load test harness is good practice when learning how they work, but it's almost always more productive to use an existing tool for such a job. Check out [SoapUI](http://www.soapui.org/) as an example (there are several out there). – M.Babcock Oct 30 '13 at 00:07

2 Answers


By default, HTTP requests on ASP.NET are limited to 12 times the number of cores. I recommend setting ServicePointManager.DefaultConnectionLimit to int.MaxValue.
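A minimal sketch of where the assignment belongs (the URL is a hypothetical placeholder): the key point is that it must run before the first request, because the `ServicePoint` created for a host keeps the connection limit it was created with.

```csharp
using System;
using System.Net;
using System.Net.Http;

class Program
{
    static void Main()
    {
        // Must run before the first HTTP request: once a ServicePoint exists
        // for the target host, it retains the old connection limit.
        ServicePointManager.DefaultConnectionLimit = int.MaxValue;

        // ... create HttpClient(s) and issue the requests as before, e.g.:
        // new HttpClient().GetAsync("http://localhost/api/values").Wait();
    }
}
```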

Stephen Cleary
  • Hello Stephen, thank you for the reply. I've tried changing the value of `ServicePointManager.DefaultConnectionLimit` on both the test harness side and the web service side. I can see under the debugger that `ServicePointManager.DefaultConnectionLimit` really changed, but the service still receives only ~10 concurrent requests. – buhtopuhta Oct 30 '13 at 11:05
  • As long as you're setting it before the first request, you shouldn't be limited on your client side. Do you have any limits on the server side? – Stephen Cleary Oct 30 '13 at 12:05
  • I set the limits on the server side exactly the same as on the client side. As far as I can see, the result is the same for limits set like [here](http://msdn.microsoft.com/ru-ru/library/ms998549#scalenetchapt06_topic5) and for `DefaultConnectionLimit = 1`. Setting `processModel.maxIoThreads` in `machine.config` has no effect either. **[UPD]** I'm wrong: setting `DefaultConnectionLimit = 1` at runtime (not in config) on the server side does change the behavior: I get the first 10 requests in a bunch and the rest at 1 per second. – buhtopuhta Oct 30 '13 at 12:11

Well, the root of the problem lies in the concurrent request handling limit of IIS on Windows 7 (some info about such limits here). Moving the service to a machine running Windows Server solved the problem.
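For reference, IIS's own concurrency setting lives in applicationHost.config; a sketch (`serverRuntime/appConcurrentRequestLimit` is an IIS 7.x setting, but on client editions of Windows the per-SKU concurrent request cap is hard-coded and simply queues the excess, so raising this value there does not lift the limit):

```xml
<!-- %windir%\System32\inetsrv\config\applicationHost.config -->
<system.webServer>
    <serverRuntime appConcurrentRequestLimit="5000" />
</system.webServer>
```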

buhtopuhta