We are attempting to submit a large number of simultaneous web requests using HttpWebRequest in order to stress-test a server.
For this test, the web service method being exercised simply pauses for 30 seconds and then returns, which allows us to test queue depth and simultaneous-request handling.
The client log shows 200 calls being queued up successfully in parallel within ~1 second using multiple calls to:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.BeginGetResponse(ResponseCallback, request); // BeginGetResponse requires an AsyncCallback and a state object
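For completeness, the callback side might look like the sketch below. `ResponseCallback` is a hypothetical name, not from the original code; the important detail is that the response is harvested and closed so the connection returns to the ServicePoint's pool.

```csharp
using System;
using System.IO;
using System.Net;

// Hypothetical completion callback (illustration only): harvests and drains
// the response so the underlying connection is freed for reuse.
static void ResponseCallback(IAsyncResult ar)
{
    HttpWebRequest request = (HttpWebRequest)ar.AsyncState;
    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(ar))
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            // Drain the body; an undrained response keeps the connection busy.
            reader.ReadToEnd();
        }
    }
    catch (WebException ex)
    {
        Console.WriteLine("Request failed: " + ex.Status);
    }
}
```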
Fiddler shows the first ~80 requests being sent to the server simultaneously; then, before any of the initial responses have come back, Fiddler shows the remaining ~120 requests trickling out at about one per second.
Following the suggestions from here has allowed an increase in the effective limit from about 10 to about 80. I want to remove the next barrier without resorting to multiple machines/processes. Any ideas?
// Please excuse the blind setting of these values while hunting for the other problem.
// These were set much higher, but that did not help.
ThreadPool.SetMinThreads(30, 15);
ThreadPool.SetMaxThreads(30, 15);
System.Net.ServicePointManager.DefaultConnectionLimit = 1000;
and in App.config:
<system.net>
  <connectionManagement>
    <clear/>
  </connectionManagement>
</system.net>
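Put together, the client side of the test looks roughly like this minimal sketch. The URI and request count are placeholders, and the inline lambda callback is an illustration of the pattern rather than the original harness:

```csharp
using System;
using System.Net;
using System.Threading;

class StressClient
{
    static void Main()
    {
        // Settings from above, applied before any request is created.
        ThreadPool.SetMinThreads(30, 15);
        ThreadPool.SetMaxThreads(30, 15);
        ServicePointManager.DefaultConnectionLimit = 1000;

        Uri uri = new Uri("http://example.com/delaytest"); // placeholder URI

        for (int i = 0; i < 200; i++)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
            request.BeginGetResponse(ar =>
            {
                var req = (HttpWebRequest)ar.AsyncState;
                try
                {
                    // Harvest and close the response to free the connection.
                    req.EndGetResponse(ar).Close();
                }
                catch (WebException) { /* ignored for the stress test */ }
            }, request);
        }

        Console.ReadLine(); // keep the process alive while responses come back
    }
}
```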
One more observation: if a large number of requests run to completion before the simple delay test starts, the delay test can then reach 80 or so simultaneous requests. Starting "cold" usually limits it to about 10.