
I want to run many HTTP requests in parallel using Python. I tried the aiohttp module together with asyncio.

import aiohttp
import asyncio

async def main():
    async with aiohttp.ClientSession() as session:
        for i in range(10):
            async with session.get('https://httpbin.org/get') as response:
                html = await response.text()
                print('done' + str(i))

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

I expected it to execute all the requests in parallel, but they are executed one by one. I later solved this using threading, but I would like to know what's wrong with this approach.

1 Answer

You need to make the requests concurrently. Currently you have a single task defined by main(), and inside it each `await` completes before the next request is issued, so the HTTP requests run serially within that one task.

The fix is to wrap each request in its own coroutine and schedule them all together with asyncio.gather(). If you are on Python 3.7+, you can also use asyncio.run(), which abstracts away the creation of the event loop:

import aiohttp
import asyncio

async def getResponse(session, i):
    async with session.get('https://httpbin.org/get') as response:
        html = await response.text()
        print('done' + str(i))

async def main():
    async with aiohttp.ClientSession() as session:
        tasks = [getResponse(session, i) for i in range(10)] # create list of tasks
        await asyncio.gather(*tasks) # execute them in concurrent manner

asyncio.run(main())
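If you also need the response bodies, note that asyncio.gather() returns each coroutine's result, in submission order. Here is a minimal sketch of that pattern; it uses asyncio.sleep() to stand in for the aiohttp call so you can run it without a network connection (the fake_request name and 0.1 s delay are just placeholders):

```python
import asyncio
import time

async def fake_request(i):
    # stand-in for `async with session.get(...)`; simulates 0.1 s of network latency
    await asyncio.sleep(0.1)
    return i

async def main():
    start = time.perf_counter()
    # gather() runs all ten coroutines concurrently and returns
    # their results in the order the coroutines were passed in
    results = await asyncio.gather(*(fake_request(i) for i in range(10)))
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results)
print(elapsed)
```

Because the ten sleeps overlap, the total elapsed time stays close to 0.1 s rather than the roughly 1 s a serial loop would take, which is the same speedup you should see with real HTTP requests.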
Krishna Chaurasia