I'm using grequests to make about 10,000 calls, but some of these calls come back as 503. This problem goes away if I don't queue all 10,000 calls at once: breaking them into groups of 1000 seems to do the trick. However, I was wondering if there's a way to catch this 503 error and just retry the request.

This is how I'm calling and combining the threads:

import grequests
rs = (grequests.get(u, headers=header) for u in urls)
response = grequests.map(rs)

I know this is really vague, but I don't even know if this is possible using grequests.

I naively tried

import grequests
import time

rs = (grequests.get(u, headers=header) for u in urls)
time.sleep(1)
response = grequests.map(rs)

But this does nothing to slow it down.
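The batching workaround mentioned above (groups of 1000) can be sketched like this; the `chunks` helper is mine, and the grequests lines are shown in comments since `urls` and `header` come from the question:

```python
def chunks(seq, size):
    """Yield successive slices of seq, each at most `size` items long."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# Map each batch separately, so only one group's requests are
# queued at a time (urls/header as defined in the question):
#
# responses = []
# for batch in chunks(urls, 1000):
#     rs = (grequests.get(u, headers=header) for u in batch)
#     responses.extend(grequests.map(rs))

print(list(chunks(list(range(5)), 2)))  # [[0, 1], [2, 3], [4]]
```

Note that `grequests.map` also accepts a `size` argument that caps how many requests run concurrently (via a gevent pool), which may be a simpler way to throttle than manual batching.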

Rafael

2 Answers

Maybe you can try using event hooks to catch the error and re-launch the requests: http://docs.python-requests.org/en/master/user/advanced/#event-hooks

 import grequests

 def response_handler(response):
     if response.status_code == 503:
         print('error.503')

 rs = (grequests.get(u, headers=header, hooks={'response': response_handler}) for u in urls)
 response = grequests.map(rs)
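A response hook can only observe the 503, though; to actually retry, the failed URLs need to be collected and mapped again. A rough sketch of that loop (the `needs_retry` helper and the stub `Response` are mine, not part of grequests; the re-map loop is shown in comments):

```python
from collections import namedtuple

# Stub standing in for requests.Response, just enough for the sketch.
Response = namedtuple('Response', 'url status_code')

def needs_retry(response):
    """True for dropped requests (grequests.map yields None) and 503s."""
    return response is None or response.status_code == 503

# With grequests this would drive a re-map loop, e.g.:
#
# pending = list(urls)
# while pending:
#     rs = (grequests.get(u, headers=header) for u in pending)
#     responses = grequests.map(rs)
#     pending = [u for u, r in zip(pending, responses) if needs_retry(r)]

print(needs_retry(Response('http://x', 503)))  # True
print(needs_retry(Response('http://x', 200)))  # False
```

In practice you'd also want a retry cap, or a persistent 503 will loop forever.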
oscomon

You should be able to apply the general methods described at Can I set max_retries for requests.request? to configure requests and urllib3 to retry automatically, assuming that grequests lets you customize the underlying sessions or requests.
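That approach can be sketched like this: mount a retrying `HTTPAdapter` on a `requests.Session` and hand that session to grequests. Note that passing `session=` through `grequests.get`, and the particular retry values, are my assumptions here, not something verified at the question's scale:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry up to 5 times, with increasing back-off between attempts,
# but only for the status codes listed in status_forcelist.
retries = Retry(total=5, backoff_factor=0.5, status_forcelist=[503])
session = requests.Session()
session.mount('http://', HTTPAdapter(max_retries=retries))
session.mount('https://', HTTPAdapter(max_retries=retries))

# grequests forwards a `session` kwarg to requests, so the retrying
# session can (in principle) back the async calls:
#
# rs = (grequests.get(u, headers=header, session=session) for u in urls)
# response = grequests.map(rs)
```

The `backoff_factor` makes urllib3 sleep between retries, which also spaces out the re-sent requests instead of hammering the server again immediately.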

nealmcb