I've run into the following issue with web service request processing:
Preamble
I have
- a Web API service hosted on IIS 7.0 on my local machine
- a test harness console application on the same machine
and I'm trying to simulate load on the web service by hitting it with requests generated by the test harness app. The test harness core code:
static int HitsCount = 40;

static async void PerformHitting()
{
    await Task.WhenAll(ParallelEnumerable.Range(0, HitsCount)
        .Select(_ => HitAsync())
        .WithDegreeOfParallelism(HitsCount));
}

static async Task HitAsync()
{
    // some logging skipped here
    ...
    await new HttpClient().GetAsync(TargetUrl, HttpCompletionOption.ResponseHeadersRead);
}
Expectation
Logging shows that all HitAsync() calls are made almost simultaneously: every hit via HttpClient starts within the [0s; 0.1s] time frame (timings are roughly rounded here and below). Hence, I expect to see all these requests arrive on the web service side within approximately the same time frame.
Reality
But logging on the service side shows that the requests arrive in batches of 8-12 requests each, and the service picks up these batches at roughly one-second intervals:
[0s, 0.3s] <- requests #0-#10
[1.2s, 1.6s] <- requests #10-#20
...
[4.1s, 4.5s] <- requests #30-#40
As a result, I'm getting really long total execution times for any significant HitsCount value.
Question
I suspect some kind of built-in service throttling mechanism or a framework-level limit on concurrent connections. The only thing I found related to this guess is that, but I had no success with the solutions suggested there.
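To make that guess concrete: the kind of client-side limit I have in mind is System.Net.ServicePointManager.DefaultConnectionLimit, which caps concurrent outgoing connections per endpoint and defaults to a small value for console apps. A minimal sketch of relaxing it in the harness, assuming that is the relevant knob (the value 100 is arbitrary):

// Sketch, assuming a .NET Framework console harness: raise the
// per-endpoint cap on concurrent outgoing HTTP connections before
// the first request is sent. The value 100 is arbitrary.
System.Net.ServicePointManager.DefaultConnectionLimit = 100;

(The same limit can also be set via the connectionManagement element in app.config; there may equally be a queue or concurrency limit on the IIS/ASP.NET side.)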
Any ideas what the issue is?
Thanks.