I'm new to Python and have a basic question, but I'm struggling to find an answer because a lot of the examples online seem to refer to deprecated APIs. Sorry if this has been asked before.
I'm looking for a way to execute multiple (similar) web requests concurrently, and retrieve the results in a list.
The synchronous version I have right now is something like:
import requests

urls = ['http://example1.org', 'http://example2.org', '...']

def getResult(urls):
    result = []
    for url in urls:
        # blocking: each request waits for the previous one to finish
        result.append(requests.get(url).json())
    return result
I'm looking for the asynchronous equivalent, where all the requests run concurrently but I still wait for all of them to finish before returning the combined results.
From what I've seen, I need to use async/await and aiohttp, but the examples I found seemed way too complicated for the simple task I have in mind.
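For reference, below is a rough sketch of what I've pieced together from the aiohttp docs. getResultAsync and fetch are just names I made up, and this assumes aiohttp is installed; I'm not sure it's the idiomatic way to do it.

import asyncio
import aiohttp

async def getResultAsync(urls):
    # one session shared by all the requests
    async with aiohttp.ClientSession() as session:
        async def fetch(url):
            async with session.get(url) as response:
                return await response.json()
        # start all the requests concurrently and wait for every one
        # to finish; gather returns the results in the order of urls
        return await asyncio.gather(*(fetch(url) for url in urls))

# urls is the same list as in the synchronous version above
results = asyncio.run(getResultAsync(urls))

Is this on the right track, or is there something simpler for this use case?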
Thanks