I have a function that sends two different requests, and I need to call it with different parameters 20 times.

I would like to run these calls concurrently (with different arguments) to save the time spent waiting between request and response.

This is a very simplified function:

import asyncio
import requests

async def get_data(url):
    return requests.get(url)

And this is how I call it:

loop = asyncio.get_event_loop()
tasks = [asyncio.ensure_future(get_data(url)) for url in websites.split('\n')]
group = asyncio.gather(*tasks)
results = loop.run_until_complete(group)
print(results)
loop.close()

The problem is that the requests run sequentially instead of concurrently.

I'm obviously missing something. Do you know what's wrong?

Milano
    Your code is going to run sequentially because requests isn't asyncio-aware. You'll want to use a library like aiohttp so that additional requests can be made while you're awaiting others. – dirn Oct 09 '20 at 19:03
  • Answered here https://stackoverflow.com/a/63881674/13782669 – alex_noname Oct 09 '20 at 20:55

1 Answer

Don't wrap the coroutines in asyncio.ensure_future; pass them directly to asyncio.gather, using * to unpack them, and call loop.run_until_complete at the end:

loop = asyncio.get_event_loop()
tasks = [get_data(url) for url in websites.split('\n')]
group = asyncio.gather(*tasks)
results = loop.run_until_complete(group)
print(results)
loop.close()

Plus, it still won't be concurrent, because requests isn't asynchronous: each call blocks the thread, so no other coroutine can run while a request is in flight. You need an async HTTP client such as aiohttp instead.
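To see why this matters, here is a minimal, self-contained sketch of the pattern. It uses asyncio.sleep as a stand-in for a non-blocking HTTP call (an aiohttp session.get would yield to the event loop in the same way), and the URLs are made up for illustration. Because every coroutine awaits instead of blocking, the waits overlap and 3 "requests" finish in roughly the time of one:

```python
import asyncio
import time

# Stand-in for a non-blocking HTTP call; with aiohttp you would
# `await session.get(url)` here instead of sleeping.
async def get_data(url):
    await asyncio.sleep(1)  # simulated network latency
    return f"response from {url}"

async def main(urls):
    # gather schedules all coroutines at once, so their awaits overlap
    return await asyncio.gather(*(get_data(u) for u in urls))

urls = ["https://a.example", "https://b.example", "https://c.example"]
start = time.perf_counter()
results = asyncio.run(main(urls))
elapsed = time.perf_counter() - start
print(results)
print(f"elapsed: {elapsed:.1f}s")  # roughly 1s total, not 3s
```

If get_data called requests.get instead, the same code would take about one second per URL, because each blocking call would stall the event loop until it returned.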

GProst