
Flexible support for retrying failed HTTP requests in Python is available via the urllib3 and requests modules, as described at Can I set max_retries for requests.request? The default options allow for an exponential backoff approach.
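For reference, the standard setup looks roughly like this (the URL, status codes, and numbers are illustrative):

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# Exponential backoff: sleeps grow roughly as backoff_factor * 2**n
retries = Retry(total=5, backoff_factor=0.5,
                status_forcelist=[500, 502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retries))
response = session.get("https://example.com/")  # illustrative URL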

But I'll have lots of clients running in parallel against my server, and I want the clients to help manage the load by backing off at different, random rates. An earlier proposal for a retry_callable parameter, which would have provided an arbitrary callback function when making a Retry object, as described at A look at the new retry behavior in urllib3 | Kevin Burke, didn't make it into the final code. Is there any other way to accomplish what I'm looking for, without resorting to implementing the complexities of retrying myself via the sort of try-except blocks that he is helping us avoid?

nealmcb

2 Answers


This has been around for some time, but anyway: the most straightforward solution would be to subclass Retry, overriding .get_backoff_time(), e.g.

from random import random
from urllib3.util.retry import Retry

class RandomisedRetry(Retry):
    def get_backoff_time(self):
        # Scale the standard exponential backoff by a random factor in [0, 1)
        return random() * super().get_backoff_time()

As .get_backoff_time() is part of Retry's public API, I would assume that it should stay around for a while.
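For illustration, a hedged usage sketch, mounting the subclass on a requests session just as one would a plain Retry (the numbers are arbitrary):

import requests
from requests.adapters import HTTPAdapter

session = requests.Session()
# The subclass drops in wherever a plain Retry would be used
session.mount("https://", HTTPAdapter(
    max_retries=RandomisedRetry(total=5, backoff_factor=1.0)))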

Actually, as a side note: passing a lambda is often a poor person's inheritance, so when considering a lambda, consider subclassing instead.


After looking at the Retry code, I think a possible workaround might be to use a class with overridden multiplication operators:

class FuncBackoff:
    def __init__(self, func):
        # func: a zero-argument callable returning the desired backoff factor
        self.func = func

    def __mul__(self, other):
        # Re-invoke func on every multiplication, so each use
        # sees a freshly computed factor
        return self.func() * other

    def __rmul__(self, other):
        return self.__mul__(other)

And use it along the lines of Retry(..., backoff_factor=FuncBackoff(lambda: random.uniform(1, 5)), ...)

Obviously, this may break if the way the backoff time is calculated changes in the future.
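A quick, self-contained way to see the operator overloading in action, independent of Retry itself (output values are illustrative):

import random

fb = FuncBackoff(lambda: random.uniform(1, 5))
print(fb * 2)  # __mul__ calls the lambda, e.g. 7.3
print(2 * fb)  # __rmul__ delegates to __mul__; a fresh random draw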

  • Intriguing. But have you tried this? `backoff_factor` is a `float`, not a function, and the value you pass will linearly affect the entire sequence of retries. To achieve that effect, it would be simpler to just directly call `random.uniform` for the value in the `Retry` call, rather than defining a new class. https://urllib3.readthedocs.io/en/latest/reference/urllib3.util.html – nealmcb Feb 26 '21 at 18:51
  • I haven't really tried this; it's just a solution I have in mind. In this solution `backoff_factor` is not a function but an instance of the `FuncBackoff` class, and this instance supports multiplication, which is the only operation applied to `backoff_factor` in the `Retry` code. It's supposed to act like a new random number on every retry iteration, which is not the same as passing a random number as the value in the `Retry` call. – Dmitry A. Shashkin Feb 27 '21 at 20:47
  • But it is only evaluated upon function call. The Retry class would need to accept a function and repeatedly call it internally to get that sort of more dynamic behavior. – nealmcb Feb 27 '21 at 21:25