
I'm hitting an API that returns a 429 (Too Many Requests) response if you hit it more than 250 times in a five-minute span. The count resets every five minutes, so I've been handling it like this:

try
{
    return request.GetResponse();  
}
catch (Exception e)
{
    if (e.Message.Contains("429"))
    {
        System.Threading.Thread.Sleep(5 * 60 * 1000);
        return request.GetResponse();
    }
    else
    {
        throw new Exception(e.Message);
    }
}

Is this the correct way to handle this situation?

ye-olde-dev
  • Seems very error prone. Why are you calling it that frequently? Can you get more data with a single request? – Daniel A. White Dec 26 '17 at 20:01
  • This feels like logic using exceptions, which is almost never the best option. `try..catch` is pretty heavy for what might be a conditional instead. – Mark Schultheiss Dec 26 '17 at 20:24
  • @DanielA.White It's pretty rare that it gets called this frequently: about once a quarter, when data is verified. Unfortunately, during those times, it needs to be called a lot in one day, and I don't have control over the REST service to expand upon the data returned. – ye-olde-dev Dec 26 '17 at 21:43

2 Answers


One thing to consider here is that the API you are consuming might "penalize" you if you start going over the limit. I think it's best to be proactive and throttle your API requests so that you never receive the 429 errors in the first place.

We started experiencing the same thing (429 errors) in our app, from an API we consume. The limit on that particular API was 10 requests every 60 seconds. We implemented a throttle() function that uses a memory cache with the date/time embedded in the keys. The API we used also tracked usage on a per-account basis; you may not need that. But this is a snippet we used to throttle our requests and ensure we always stayed under the limit:

    private void throttle()
    {
        // Requires: using System.Linq; using System.Runtime.Caching;
        var maxPerPeriod = 250;
        //If you utilize multiple accounts, you can throttle per account. If not, any fixed prefix works.
        //The key parts below are joined with underscores, so keep underscores out of the prefix itself:
        var keyPrefix = "aUniqueIdForThrottling";
        var intervalPeriod = 300000;//5 minutes
        var sleepInterval = 5000;//period to "sleep" before trying again (if the limits have been reached)
        var recentTransactions = MemoryCache.Default.Count(x => x.Key.StartsWith(keyPrefix));
        while (recentTransactions >= maxPerPeriod)
        {
            System.Threading.Thread.Sleep(sleepInterval);
            recentTransactions = MemoryCache.Default.Count(x => x.Key.StartsWith(keyPrefix));
        }
        //Record this request: one cache entry per call, bucketed by minute,
        //with a numeric suffix to keep keys within the same minute unique.
        var key = keyPrefix + "_" + DateTime.UtcNow.ToString("yyyyMMddHHmm");
        var existingThisMinute = MemoryCache.Default.Count(x => x.Key.StartsWith(key));
        if (existingThisMinute > 0)
        {
            key = key + "_" + (existingThisMinute + 1);
        }
        //Each entry expires after the interval, so old requests drop out of the count automatically.
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.UtcNow.AddMilliseconds(intervalPeriod)
        };
        MemoryCache.Default.Set(key, 1, policy);
    }

We used this throttle() function in our code like this:

    public override void DoAction()
    {
        throttle();
        var url = ContentUri;
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "GET";
        request.Headers.Add("Authorization", "Bearer " + AccessToken);
        request.Accept = "application/json";
        //Call GetResponse() only once: a second call would issue a second
        //request and count against the rate limit again.
        using (WebResponse response = request.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        {
            //DO STUFF WITH THE DOWNLOADED DATA HERE...
        }
    }

It essentially keeps track of your requests in the cache. If the limit has been reached, it pauses until enough of the older entries have expired that you are back under the limit.

Matt Spinks
  • Thanks Matt, that's good advice to pause before throwing the error. I'm going to try to implement something similar to this. – ye-olde-dev Dec 26 '17 at 21:50
  • Thanks again Matt, implemented a similar approach (without the account-specific data) in my code and it worked like a charm. Essentially used a combination of checking a call count and UTC timestamps. – ye-olde-dev Dec 27 '17 at 18:01

You should not be doing string parsing on an exception's message. In turn, exception classes should never put important information into the message field only.

Your entire catching and rethrowing of the exception is also wrong: `throw new Exception(e.Message)` throws away the original exception type and stack trace. So on that front you should read up on proper exception handling. Here are two articles I link often:

https://learn.microsoft.com/en-us/archive/blogs/ericlippert/vexing-exceptions
http://www.codeproject.com/Articles/9538/Exception-Handling-Best-Practices-in-NET
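
For illustration, a minimal sketch of the rethrowing difference (DoSomething() and Log() are hypothetical stand-ins, not anything from your code):

    try
    {
        DoSomething();
    }
    catch (Exception e)
    {
        Log(e); // hypothetical: do whatever handling you actually can do here

        // A bare "throw;" preserves the original exception type and stack trace.
        // "throw new Exception(e.Message);" (as in the question) destroys both.
        throw;
    }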

Unless I got the wrong network class, you should catch WebException only. Its Response property (an HttpWebResponse for HTTP requests) is how you get the actual HTTP status code out, rather than parsing the message text.
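
Something along those lines, as a sketch (CreateRequest() is a hypothetical factory for however you build the request; an HttpWebRequest cannot be reused after a failed call, and 429 has no named HttpStatusCode value on older frameworks, so it is compared numerically):

    try
    {
        return CreateRequest().GetResponse();
    }
    catch (WebException e)
    {
        var http = e.Response as HttpWebResponse; // null for timeouts etc.
        if (http != null && (int)http.StatusCode == 429) // Too Many Requests
        {
            System.Threading.Thread.Sleep(5 * 60 * 1000); // wait out the window
            return CreateRequest().GetResponse();         // fresh request for the retry
        }
        throw; // everything else is rethrown unchanged
    }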

Better still would be to avoid the exception altogether. So I have to ask: why are you even calling that function that often in a row? Is there no proper function for bulk retrieval? Is the service even intended to be automated like this?

Christopher
  • In truth, the service is not meant to be used this way, but until the service is updated, the business rules require that we do. I will definitely read up a bit more on exception handling. Admittedly this is one of my weaker areas, which is why I was asking for advice. Thanks! – ye-olde-dev Dec 26 '17 at 21:46
  • Then you need to throttle your access to it, so that the 250 calls in 5 minutes are enough. Maybe you could get somewhat around the limit by putting your own service that caches results between the client and the original server? – Christopher Dec 26 '17 at 21:53
  • I'm actually in the process of building this out as we speak and this is more of a band-aid approach for the short term to keep the lights on. – ye-olde-dev Dec 26 '17 at 22:02