I have a coroutine that sends web requests and post-processes the responses. Currently I'm doing this:
import asyncio

async def scrape(url, sess, logging=None):
    # request
    result = sess.get(url, headers=headers(url))
    # process
    if result.ok:
        await post_process(result.content)

async def main():
    # code here
    for url in urls:
        await asyncio.create_task(scrape(url, sess))

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
The problem is that it runs slowly! It seems the requests are blocking the event loop. How can I turn the requests into coroutines and await their completion?
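From what I've read so far, one option might be to push the blocking call into a thread pool with run_in_executor. This is only a rough sketch (reusing my headers() and post_process() helpers from above), and I'm not sure it's the idiomatic way:

async def scrape(url, sess, logging=None):
    loop = asyncio.get_running_loop()
    # off-load the blocking sess.get call to the default thread pool
    result = await loop.run_in_executor(
        None, lambda: sess.get(url, headers=headers(url))
    )
    if result.ok:
        await post_process(result.content)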
Note: I did google this, but I haven't found a concise example of how to do it. Also, I am using requests, and some searches suggest that it is what's causing the trouble.
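I've also seen aiohttp suggested as an async replacement for requests. Would something along these lines be the better fix? (Again just a sketch reusing my headers(), post_process() and urls; I haven't verified it.)

import asyncio
import aiohttp

async def scrape(url, sess):
    # aiohttp's get is awaitable, so it no longer blocks the loop
    async with sess.get(url, headers=headers(url)) as resp:
        if resp.status == 200:
            await post_process(await resp.read())

async def main():
    async with aiohttp.ClientSession() as sess:
        # gather runs the scrapes concurrently instead of one after another
        await asyncio.gather(*(scrape(url, sess) for url in urls))

if __name__ == '__main__':
    asyncio.run(main())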