Flexible support for retrying failed HTTP requests in Python is available via the urllib3 and requests modules, as described at Can I set max_retries for requests.request? The default options allow for an exponential backoff approach.
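For reference, a minimal version of that standard setup might look like the sketch below (the URL and the specific Retry options are just placeholders I've picked for illustration):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Standard setup: retries with exponential backoff via backoff_factor,
# but no randomization -- every client backs off on the same schedule.
session = requests.Session()
retries = Retry(total=5, backoff_factor=0.5,
                status_forcelist=[500, 502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retries))

response = session.get("https://example.com/api")  # placeholder URL
```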
But I'll have lots of clients running in parallel against my server, and I want the clients to help manage the load by backing off at different, randomized rates. An earlier proposal for a retry_callable parameter, which would have accepted an arbitrary callback when constructing a Retry object, as described at A look at the new retry behavior in urllib3 | Kevin Burke, didn't make it into the final code. Is there any other way to accomplish what I'm looking for, without resorting to implementing the complexities of retrying myself via the sort of try-except blocks that his approach helps us avoid?
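For concreteness, the behaviour I'm hoping for might look roughly like the sketch below, assuming it is reasonable to override Retry.get_backoff_time in a subclass (JitteredRetry is just a hypothetical name I've made up):

```python
import random

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

class JitteredRetry(Retry):  # hypothetical name
    """Retry subclass that randomizes the exponential backoff delay."""

    def get_backoff_time(self):
        # Scale the standard exponential backoff by a random factor so
        # parallel clients spread their retries out instead of all
        # retrying in lockstep.
        return super().get_backoff_time() * random.uniform(0.5, 1.5)

session = requests.Session()
session.mount("https://",
              HTTPAdapter(max_retries=JitteredRetry(total=5, backoff_factor=0.5)))
```

Is subclassing like this a sensible route, or is there a supported option I've missed?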