As a learning exercise, I'm trying to modify the quickstart example of aiohttp to fetch multiple urls with a single ClientSession (the docs suggest that usually one ClientSession should be created per application).
import aiohttp
import asyncio

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def main(url, session):
    print(f"Starting '{url}'")
    html = await fetch(session, url)
    print(f"'{url}' done")

urls = (
    "https://python.org",
    "https://twitter.com",
    "https://tumblr.com",
    "https://example.com",
    "https://github.com",
)

loop = asyncio.get_event_loop()
session = aiohttp.ClientSession()
loop.run_until_complete(asyncio.gather(
    *(loop.create_task(main(url, session)) for url in urls)
))
# session.close()  <- this doesn't make a difference
However, creating the ClientSession outside a coroutine is clearly not the way to go:
➜ python 1_async.py
1_async.py:30: UserWarning: Creating a client session outside of coroutine is a very dangerous idea
  session = aiohttp.ClientSession()
Creating a client session outside of coroutine
client_session:
Starting 'https://python.org'
Starting 'https://twitter.com'
Starting 'https://tumblr.com'
Starting 'https://example.com'
Starting 'https://github.com'
'https://twitter.com' done
'https://example.com' done
'https://github.com' done
'https://python.org' done
'https://tumblr.com' done
1_async.py:34: RuntimeWarning: coroutine 'ClientSession.close' was never awaited
  session.close()
Unclosed client session
client_session:
Unclosed connector
connections: ['[(, 15024.110107067)]', '[(, 15024.147785039)]', '[(, 15024.252375415)]', '[(, 15024.292646968)]', '[(, 15024.342368087)]', '[(, 15024.466971983)]', '[(, 15024.602057745)]', '[(, 15024.837045568)]']
connector:
FWIW, this is what main looked like before I attempted the change above:
async def main(url):
    async with aiohttp.ClientSession() as session:
        print(f"Starting '{url}'")
        html = await fetch(session, url)
        print(f"'{url}' done")
What would be the correct way to do this? I thought about passing a list of urls to main, but I couldn't make the fetches run concurrently; they always ended up executing one after another.
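The closest I've gotten is the sketch below: wrap everything in a single top-level coroutine so the session is created (and closed) inside the running event loop, then fan the urls out with asyncio.gather. The helper name run_one and the use of asyncio.run are my own choices, not from the docs, so I'm not sure this is the idiomatic pattern:

```python
import asyncio
import aiohttp

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def run_one(session, url):
    # One unit of work: fetch a single url through the shared session.
    print(f"Starting '{url}'")
    await fetch(session, url)
    print(f"'{url}' done")

async def main(urls):
    # The session is created inside a coroutine and closed by the
    # async context manager, so neither warning should fire.
    async with aiohttp.ClientSession() as session:
        await asyncio.gather(*(run_one(session, url) for url in urls))

if __name__ == "__main__":
    asyncio.run(main([
        "https://python.org",
        "https://example.com",
    ]))
```

Is something along these lines the recommended approach, or is there a better way to share one ClientSession across many concurrent requests?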