
I have the following async code that gets called from so many places in my project:

public async Task<HttpResponseMessage> MakeRequestAsync(HttpRequestMessage request)
{            
    var client = new HttpClient();
    return await client.SendAsync(request).ConfigureAwait(false);                
}

An example of how the above method gets called:

var tasks = items.Select(async i =>
{
    var response = await MakeRequestAsync(new HttpRequestMessage(HttpMethod.Get, i.Url));
    // do something with response
});

The ZenDesk API that I'm hitting allows about 200 requests per minute, after which I get a 429 error. I need to do some sort of a Thread.Sleep if I encounter the 429 error, but with async/await, there may be so many requests in parallel threads waiting to process that I am not sure how I can make all of them sleep for 5 seconds or so and then resume again.

What's the correct way to approach this problem? I'd like to hear quick solutions as well as good-design solutions.

Prabhu
  • "there may be so many requests in parallel threads" - there are exactly as many requests as you've scheduled. Just make sure you schedule as many requests as needed and be happy. And please use `Task.Delay` if you really have to. – Alexei Levenkov Jul 30 '14 at 04:34
  • How are you calling this method? Why are there parallel threads? – Yuval Itzchakov Jul 30 '14 at 04:41
  • What I meant by parallel is that the MakeRequestAsync gets called from multiple places, so requests are being made at the same time. – Prabhu Jul 30 '14 at 04:45
  • @AlexeiLevenkov How can I control the number of requests made? Do I need some sort of a queue? I still need to handle the 429 error though because the 200 requests per minute is just a guideline, it may be different at times. – Prabhu Jul 30 '14 at 04:52
  • A lot of similar questions have already been answered here, for example: http://stackoverflow.com/a/20904462/2674222. Terms to search for: "throttling", "Dataflow", "SemaphoreSlim". – avo Jul 30 '14 at 05:16
  • Similar question with a nice answer: http://stackoverflow.com/questions/10257312/how-to-limit-number-of-httpwebrequest-per-second-towards-a-webserver – Martin Liversage Jul 30 '14 at 09:08

2 Answers


I do not think that this is a duplicate, as it was recently marked. The other SO poster does not need a time-based sliding window (time-based throttling), and the answer there does not cover this situation; it only works when you want to set a hard limit on outgoing requests.

Anyway, a quasi-quick solution is to do the throttling inside the MakeRequestAsync method. Something like this:

public async Task<HttpResponseMessage> MakeRequestAsync(HttpRequestMessage request)
{
    // Wait while the limit has been reached.
    while (!_throttlingHelper.RequestAllowed)
    {
        await Task.Delay(1000);
    }

    var client = new HttpClient();

    _throttlingHelper.StartRequest();   // register this request's timestamp
    var result = await client.SendAsync(request).ConfigureAwait(false);
    _throttlingHelper.EndRequest();     // prune timestamps that have fallen out of the window

    return result;
}

The ThrottlingHelper class is just something I put together now, so you may need to debug it a bit (read: it may not work out of the box). It tries to be a sliding window over request timestamps.

public class ThrottlingHelper : IDisposable
{
    //Holds time stamps for all started requests
    private readonly List<long> _requestsTx;
    private readonly ReaderWriterLockSlim _lock;

    private readonly int _maxLimit;
    private readonly TimeSpan _interval;

    public ThrottlingHelper(int maxLimit, TimeSpan interval)
    {
        _requestsTx = new List<long>();
        _maxLimit = maxLimit;
        _interval = interval;
        _lock = new ReaderWriterLockSlim(LockRecursionPolicy.NoRecursion);
    }

    public bool RequestAllowed
    {
        get
        {
            _lock.EnterReadLock();
            try
            {
                var nowTx = DateTime.Now.Ticks;
                return _requestsTx.Count(tx => nowTx - tx < _interval.Ticks) < _maxLimit;
            }
            finally
            {
                _lock.ExitReadLock();
            }
        }
    }

    public void StartRequest()
    {
        _lock.EnterWriteLock();
        try
        {
            _requestsTx.Add(DateTime.Now.Ticks);
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    public void EndRequest()
    {
        _lock.EnterWriteLock();
        try
        {
            var nowTx = DateTime.Now.Ticks;
            _requestsTx.RemoveAll(tx => nowTx - tx >= _interval.Ticks);
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    public void Dispose()
    {
        _lock.Dispose();
    }
}

You would use it as a member in the class that makes the requests, and instantiate it like this:

_throttlingHelper = new ThrottlingHelper(200, TimeSpan.FromMinutes(1));

Don't forget to dispose it when you're done with it.
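
For illustration, here is a minimal sketch of how it might be wired up as a shared member and disposed together with the owning class (the class name ZenDeskApiClient is just a placeholder, not something from the question):

public class ZenDeskApiClient : IDisposable
{
    // A single shared helper, so every call to MakeRequestAsync goes through the same sliding window.
    private readonly ThrottlingHelper _throttlingHelper =
        new ThrottlingHelper(200, TimeSpan.FromMinutes(1));

    public async Task<HttpResponseMessage> MakeRequestAsync(HttpRequestMessage request)
    {
        while (!_throttlingHelper.RequestAllowed)
        {
            await Task.Delay(1000);
        }

        var client = new HttpClient();

        _throttlingHelper.StartRequest();
        var result = await client.SendAsync(request).ConfigureAwait(false);
        _throttlingHelper.EndRequest();

        return result;
    }

    public void Dispose()
    {
        _throttlingHelper.Dispose();
    }
}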

A bit of documentation about ThrottlingHelper:

  1. Constructor params are the maximum number of requests you want to be able to make in a certain interval, and the interval itself as a time span. So, 200 and 1 minute means that you want no more than 200 requests/minute.
  2. Property RequestAllowed lets you know if you are able to do a request with the current throttling settings.
  3. Methods StartRequest & EndRequest register/unregister a request by using the current date/time.

EDIT/Pitfalls

As indicated by @PhilipABarnes, EndRequest can potentially remove requests that are still in progress. As far as I can see, this can happen in two situations:

  1. The interval is small, such that requests do not get to complete in good time.
  2. Requests actually take more than the interval to execute.

The proposed solution involves actually matching EndRequest calls to StartRequest calls by means of a GUID or something similar.
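
As a rough sketch of what that could look like (my own untested variation, not code from the thread): StartRequest hands out a GUID token, EndRequest(token) marks that request as finished, and entries are only pruned once they are both finished and older than the interval, so an in-flight request is never dropped from the count.

public class ThrottlingHelperWithTokens : IDisposable
{
    private class Entry
    {
        public long StartedTx;
        public bool Finished;
    }

    // One entry per StartRequest call, keyed by the token it returns.
    private readonly Dictionary<Guid, Entry> _requests = new Dictionary<Guid, Entry>();
    private readonly ReaderWriterLockSlim _lock =
        new ReaderWriterLockSlim(LockRecursionPolicy.NoRecursion);

    private readonly int _maxLimit;
    private readonly TimeSpan _interval;

    public ThrottlingHelperWithTokens(int maxLimit, TimeSpan interval)
    {
        _maxLimit = maxLimit;
        _interval = interval;
    }

    public bool RequestAllowed
    {
        get
        {
            _lock.EnterReadLock();
            try
            {
                var nowTx = DateTime.Now.Ticks;
                // An entry counts while it is inside the window or its request is still in flight.
                return _requests.Values.Count(e => nowTx - e.StartedTx < _interval.Ticks || !e.Finished) < _maxLimit;
            }
            finally
            {
                _lock.ExitReadLock();
            }
        }
    }

    public Guid StartRequest()
    {
        _lock.EnterWriteLock();
        try
        {
            var token = Guid.NewGuid();
            _requests.Add(token, new Entry { StartedTx = DateTime.Now.Ticks, Finished = false });
            return token;
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    public void EndRequest(Guid token)
    {
        _lock.EnterWriteLock();
        try
        {
            Entry entry;
            if (_requests.TryGetValue(token, out entry))
            {
                entry.Finished = true;
            }

            // Prune only entries that are both finished and outside the window.
            var nowTx = DateTime.Now.Ticks;
            var expired = _requests
                .Where(kv => kv.Value.Finished && nowTx - kv.Value.StartedTx >= _interval.Ticks)
                .Select(kv => kv.Key)
                .ToList();
            foreach (var key in expired)
            {
                _requests.Remove(key);
            }
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    public void Dispose()
    {
        _lock.Dispose();
    }
}

The call sites then change to var token = _throttlingHelper.StartRequest(); before the request and _throttlingHelper.EndRequest(token); after it.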

Marcel N.
  • great let me give this a shot and report back. Is it also possible to use this along with batching up requests to start with, so that only 20 or so requests go out at a time? I have been trying to implement the solution suggested here: http://stackoverflow.com/a/20904462/406322 – Prabhu Jul 30 '14 at 08:15
  • @Prabhu: Yes, you can adapt this to also allow a certain number of concurrent requests by combining this solution and the one in the other post. – Marcel N. Jul 30 '14 at 08:17
  • @marcelyn So would I just put all of the code in MakeRequestAsync inside the "try" block of the other solution? – Prabhu Jul 30 '14 at 08:21
  • @Prabhu: Yes, that would be the quickest solution you can get (I think). If you want something nicer, you need to combine the two more seamlessly. – Marcel N. Jul 30 '14 at 08:22
  • while I implement this I'm trying to understand the code. What is maxLimit and what is interval? – Prabhu Jul 30 '14 at 08:27
  • Awesome! So I know I can use throttlinghelper to control the number of requests going out. But, in case I still end up getting a 429, is there any way I can use throttlinghelper to pause for 30 seconds and then retry again? – Prabhu Jul 30 '14 at 08:38
  • @Prabhu: If you still get a 429 then that means the service allows fewer than 200 requests per minute, or maybe that you made too many concurrent requests. I think you can just increase the `Task.Delay` value to more than 1s, which is what it is now. – Marcel N. Jul 30 '14 at 08:40
  • the throttlehelper should make sure that I have a max of 200 concurrent requests at any time right? – Prabhu Jul 30 '14 at 08:48
  • @Prabhu: Yes and no. At any time, but within the last 60 seconds (or whatever you pass as interval). So, it makes sure that in 60 seconds you don't make more than 200 requests, otherwise it makes you wait until some more time passes. – Marcel N. Jul 30 '14 at 08:50
  • it looks like the throttlehelper is working. Thanks for the help! I just need to figure out a way to handle the 429 in the unlikely chance that I still get it (even after tweaking the parameters). – Prabhu Jul 30 '14 at 09:04
  • 2
    _requestsTx.RemoveAll(tx => nowTx - tx >= _interval.Ticks) could potentially remove requests which are still running? Is it better to key value and remove based on a key such as GUID? – InContext Jul 30 '14 at 09:11
  • @PhilipABarnes: Yes, that could happen if the interval is too small or if there are really requests that take more than 60 seconds to complete. I'll update the answer with this potential pitfall. – Marcel N. Jul 30 '14 at 09:14
  • @marcelyn I would need to ensure that there is only one instance of throttlehelper running right? Do I need to make it static? – Prabhu Jul 30 '14 at 09:42
  • @Prabhu: yes, only one instance. – Marcel N. Jul 30 '14 at 09:43
  • @Prabhu: Sure, if that's what you need. You only need to guard the instance, as the other methods are thread safe (via the ReadWriteLockSlim). – Marcel N. Jul 30 '14 at 10:03
  • @marceln would you be able to modify it with the Guid solution. I've been cracking my head with it. Thanks – Prabhu Jul 30 '14 at 12:40
  • @MarcelN. I think I uncovered a potential issue with this code. The very first time the program runs, more requests than the max allowed would be allowed in. _throttlingHelper.StartRequest() should really go above the while loop, right? – Prabhu Aug 20 '14 at 04:56

If there are multiple requests waiting in the while loop for RequestAllowed, some of them might start at the same time. How about a simple StartRequestIfAllowed?

public class ThrottlingHelper : DisposeBase
{
    //Holds time stamps for all started requests
    private readonly List<long> _requestsTx;
    private readonly Mutex _mutex = new Mutex();

    private readonly int _maxLimit;
    private readonly TimeSpan _interval;

    public ThrottlingHelper(int maxLimit, TimeSpan interval)
    {
        _requestsTx = new List<long>();
        _maxLimit = maxLimit;
        _interval = interval;
    }


    public bool StartRequestIfAllowed
    {
        get
        {
            _mutex.WaitOne();
            try
            {
                var nowTx = DateTime.Now.Ticks;
                if (_requestsTx.Count(tx => nowTx - tx < _interval.Ticks) < _maxLimit)
                {
                    _requestsTx.Add(nowTx);
                    return true;
                }
                else
                {
                    return false;
                }
            }
            finally
            {
                _mutex.ReleaseMutex();
            }
        }
    }

    public void EndRequest()
    {
        _mutex.WaitOne();
        try
        {
            var nowTx = DateTime.Now.Ticks;
            _requestsTx.RemoveAll(tx => nowTx - tx >= _interval.Ticks);
        }
        finally
        {
            _mutex.ReleaseMutex();
        }
    }

    protected override void DisposeResources()
    {
        _mutex.Dispose();
    }
}
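
For what it's worth, a minimal sketch of how MakeRequestAsync from the other answer might use this variant, where _throttlingHelper is a shared instance of the class above (the one-second polling delay is just carried over from that answer, not a recommendation):

public async Task<HttpResponseMessage> MakeRequestAsync(HttpRequestMessage request)
{
    // The check and the registration happen atomically inside StartRequestIfAllowed,
    // so two waiters can no longer grab the same last slot.
    while (!_throttlingHelper.StartRequestIfAllowed)
    {
        await Task.Delay(1000);
    }

    var client = new HttpClient();
    var result = await client.SendAsync(request).ConfigureAwait(false);
    _throttlingHelper.EndRequest();

    return result;
}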
Tony