The problem is really two-part:
- the requests library is synchronous, so requests.post(...) will block the event loop until it completes
- you don't need the result of the web request to respond to the client, but your current handler cannot respond until that request has finished (even if it were made async)
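To make that concrete, here is a minimal sketch (the handler name and URL are hypothetical, not from your code) of the situation described above: the synchronous call ties up the event loop, and the client cannot get its response until the external service answers, even though the response does not depend on that result.

import requests

async def handle_request(payload):
    # synchronous call: the whole event loop is blocked until the
    # external service answers, so no other client is served meanwhile
    requests.post("https://example.com/notify", json=payload, timeout=10)
    # only now can we respond, even though nothing here depends on the result above
    return {"status": "ok"}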
Consider separating the request logic into another process, so it can happen at its own speed.
The key point is that you can put work into a queue of some kind to be completed eventually, without needing its result in order to respond to the client.
You could use an async HTTP request library and some collection of callbacks (see the fire-and-forget sketch below), multiprocessing to spawn a new process (or several), or something more exotic like an independent program (perhaps communicating over a pipe or sockets).
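For the first option, a minimal fire-and-forget sketch, assuming aiohttp is available (the function names, URL, and header name are illustrative, not taken from your code):

import asyncio
import aiohttp

URL_EXTERNAL_SERVICE = "https://example.com"  # illustrative

async def post_in_background(data, token):
    # the external request runs as its own task; nothing awaits its result
    try:
        async with aiohttp.ClientSession() as session:
            async with session.post(
                URL_EXTERNAL_SERVICE,
                json=data,
                headers={"Auth-Header": token},
                timeout=aiohttp.ClientTimeout(total=10),
            ) as response:
                response.raise_for_status()
    except Exception:
        pass  # log and carry on; the client response never depended on this

async def handle_request(payload, token):
    # schedule the slow call on the running loop and respond immediately
    # (keep a reference to the task in real code so it is not garbage collected)
    asyncio.create_task(post_in_background(payload, token))
    return {"status": "accepted"}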
Maybe something of this form will work for you:
import base64
import json
import multiprocessing

import requests

URL_EXTERNAL_SERVICE = "https://example.com"
TIMEOUT_REQUESTS = (2, 10)  # always set a timeout for requests
SHARED_QUEUE = multiprocessing.Queue()  # may leak as unbounded

async def slow_request(data):
    SHARED_QUEUE.put(data)
    # now returns on successful queue put, rather than request completion

def requesting_loop(logger, Q, url, token):
    while True:  # expects to be a daemon
        data = json.dumps(Q.get())  # block until retrieval (non-daemon can use sentinel here)
        response = requests.post(
            url,
            data=data,  # already JSON-encoded above, so send it as the request body
            headers={'Auth-Header': token, 'Content-Type': 'application/json'},
            timeout=TIMEOUT_REQUESTS,
        )
        # raise_for_status() --> try/except --> log + continue
        if response.status_code != 200:
            logger.error('call to {} failed (code={}) with data: {}'.format(
                url, response.status_code,
                "base64:" + base64.b64encode(data.encode()).decode()
            ))

def startup():  # run me when starting
    # do whatever is needed for logger
    # create a pool instead if you may need to process a lot of requests
    p = multiprocessing.Process(
        target=requesting_loop,
        kwargs={"logger": logger, "Q": SHARED_QUEUE, "url": URL_EXTERNAL_SERVICE, "token": settings.API_TOKEN},
        daemon=True,
    )
    p.start()
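How startup() gets called depends on your framework; purely as an illustration (assuming FastAPI, which is an assumption on my part, not something from your question), the wiring could look like:

# hypothetical wiring, assuming FastAPI; adapt to whatever framework you use
from fastapi import FastAPI

app = FastAPI()

@app.on_event("startup")
def start_worker():
    startup()  # spawns the daemon process defined above

@app.post("/submit")
async def submit(payload: dict):
    await slow_request(payload)  # just queues the work; returns immediately
    return {"status": "queued"}

Whichever framework you use, the important part is that the web handler only ever touches the queue, and the daemon process is the only place that talks to the slow external service.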