
I created a .NET Framework 4.7.2 console app that concurrently makes many requests to an API hosted on AWS. My problem is that the requests are taking too long.

The API's response time is usually 100 ms–400 ms according to the target group monitoring in the AWS console, but in my application the elapsed time of each request starts at about 1 second and keeps increasing, up to 11 seconds.

I'm already aware that disposing HttpClient doesn't release its connections right away, so we shouldn't wrap it in a using block and should instead reuse a single instance for the whole application.

I already found a similar question, but the answer there didn't solve my problem.

When I set MaxDegreeOfParallelism to 1, the response time in the application is similar to the API's own response time. This seems to be a problem that occurs when HttpClient is used from multiple threads.

This is how I'm doing the requests:

public static class RequestMaker
{
    private static readonly string _urlHttp = "http://apidomain.com/api/apiname";
    private static readonly HttpClient _httpClient = new HttpClient();
    public static async Task<string> PostAsync(string postData)
    {
        bool IsSuccessStatusCode = false;
        int maxRetries = 5;
        int count = 0;
        do
        {
            try
            {
                Stopwatch watcher = Stopwatch.StartNew();
                using (HttpContent content = new StringContent(postData, Encoding.UTF8, "application/json"))
                using (HttpResponseMessage result = await _httpClient.PostAsync(_urlHttp, content).ConfigureAwait(false))
                {
                    watcher.Stop();
                    Console.WriteLine("Elapsed = " + watcher.ElapsedMilliseconds.ToString("N0"));
                    IsSuccessStatusCode = result.IsSuccessStatusCode;
                    if (IsSuccessStatusCode)
                        return await result.Content.ReadAsStringAsync().ConfigureAwait(false);

                    count++;
                    if (count > maxRetries)
                        return "";

                    Console.WriteLine($"Retrying request because of request status code {result.StatusCode}");
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
                count++;
                if (count > maxRetries)
                    return "";
            }
        } while (!IsSuccessStatusCode);

        return "";
    }
}

This is my function calling the requests concurrently:

static void RunBatchMany(List<string> list)
{
    var getCustomerBlock = new TransformBlock<string, long>(
        async lstRec =>
        {
            ApiInputObject apiInput = new ApiInputObject();
            
            // PrepareInputObject
            string postData = JsonConvert.SerializeObject(apiInput);

            Stopwatch watcher = Stopwatch.StartNew();
            string json = await RequestMaker.PostAsync(postData);
            ApiResponseObject res = JsonConvert.DeserializeObject<ApiResponseObject>(json);
            watcher.Stop();
            return watcher.ElapsedMilliseconds;

        }, new ExecutionDataflowBlockOptions
        {
            MaxDegreeOfParallelism = 8
        });

    foreach (var id in list)
        getCustomerBlock.Post(id);

    getCustomerBlock.Complete();
    getCustomerBlock.Completion.Wait();
}
  • `if (count < maxRetries)` should probably be `if (count > maxRetries)` – Fildor Feb 03 '23 at 15:29
  • Thanks for the observation. Just fixed it. – Carlos Siestrup Feb 03 '23 at 15:45
  • Could you try with `int maxRetries = 1`, and see if the problem persists? – Theodor Zoulias Feb 03 '23 at 16:37
  • @Theodor Zoulias I'm printing the request time at every try, so I don't think it's a problem with the retry policy. Also, there were no error messages, so every request succeeded on the first try. I edited my question because I noticed that the response time is normal when MaxDegreeOfParallelism is set to 1. It seems to be a concurrency issue, maybe a problem that HttpClient has in multi-threaded applications? – Carlos Siestrup Feb 03 '23 at 16:52
  • What .NET platform are you targeting? .NET Core and later or .NET Framework? – Theodor Zoulias Feb 03 '23 at 16:54
  • I'm using .Net Framework 4.7.2 – Carlos Siestrup Feb 03 '23 at 16:57
  • How do you know the problem is the client and not the server (e.g. rate limiting, resource exhaustion, lock contention, etc.)? – John Wu Feb 03 '23 at 17:30
  • These questions might be relevant: [How can I programmatically remove the 2 connection limit in WebClient](https://stackoverflow.com/questions/866350/how-can-i-programmatically-remove-the-2-connection-limit-in-webclient), and also [How to increase the outgoing HTTP requests quota in .NET Core?](https://stackoverflow.com/questions/55372354/how-to-increase-the-outgoing-http-requests-quota-in-net-core) – Theodor Zoulias Feb 03 '23 at 19:33
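
For context, on .NET Framework the number of concurrent outgoing connections per endpoint is capped by ServicePointManager.DefaultConnectionLimit, which defaults to 2 for console apps, so most of the 8 parallel requests would queue behind each other. A minimal sketch of raising it, assuming it runs before the first request is sent and that 100 is just an illustrative value:

using System.Net;

static class Program
{
    static void Main()
    {
        // Must run before the first request goes out; the default limit of 2
        // lets only two of the eight parallel requests use the wire at once.
        ServicePointManager.DefaultConnectionLimit = 100; // illustrative value

        // ... build the input list and call RunBatchMany(list) as in the question ...
    }
}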

1 Answer


Try to isolate the await logic and make only one call to the URL, without using a loop:

HttpResponseMessage result = await _httpClient.PostAsync(_urlHttp, content).ConfigureAwait(false);

Make a unit test out of that call and call it once. Is the HttpResponse received within the expected time? If yes, slowly add the loop and the other logic back around it. You use a lot of async programming, but why? What exactly do you need it for? Why do you use a static RequestMaker?
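
A minimal sketch of that isolated call, essentially the question's code with the retry loop and timing stripped out (the class and method names here are made up for illustration):

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class SingleCall
{
    private static readonly HttpClient _httpClient = new HttpClient();

    // One request, no retries, so the raw round-trip time can be measured in isolation.
    public static async Task<string> PostOnceAsync(string url, string postData)
    {
        using (var content = new StringContent(postData, Encoding.UTF8, "application/json"))
        using (HttpResponseMessage response = await _httpClient.PostAsync(url, content).ConfigureAwait(false))
        {
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync().ConfigureAwait(false);
        }
    }
}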

  • I edited my question because I noticed the response time goes back to normal when I set MaxDegreeOfParallelism to 1. I think the issue is with the parallelism rather than the loop. – Carlos Siestrup Feb 03 '23 at 16:53
  • @CarlosSiestrup: Is it possible that the problem is really just the time taken to ramp up the thread pool? (If you run the same batch of requests multiple times in the same process, do later batches come back quicker?) – Jon Skeet Feb 03 '23 at 17:00
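
One way to probe that ramp-up hypothesis is to run the same batch twice in the same process and compare timings, or to raise the thread pool's minimum size before the first batch; a minimal sketch, where the thread counts are assumed values:

using System.Threading;

// Raising the minimum pool size avoids the gradual thread-injection delay on the first batch.
// 16/16 are illustrative numbers; pick values at or above the expected concurrency.
ThreadPool.SetMinThreads(workerThreads: 16, completionPortThreads: 16);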