
I want to fetch a lot of links with aiohttp, but it doesn't run concurrently; it processes each link in turn. How do I make the requests run concurrently?

import aiohttp
import asyncio


list = ['value1', 'value2', 'value3', ..... 'value6000']

async def main():
    async with aiohttp.ClientSession() as session:
        while True:
            for i in list:
                async with session.get(f"https://example.com/{i}") as response:
                    print(await response.json())

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
ReD
  • It's not what your issue is, but don't overwrite builtins such as `list` unless you know what you're doing. It will cause issues down the road. – Jab Sep 23 '21 at 20:39
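A short snippet illustrating the comment above (the variable name is made up for illustration): once the name `list` is rebound to your own list, the builtin is no longer reachable under that name, so later calls like `list(range(3))` fail.

import asyncio


# Shadowing the builtin, as in the question's code.
list = ['value1', 'value2', 'value3']

try:
    # This now tries to *call* your list object, not the builtin type.
    numbers = list(range(3))
except TypeError as e:
    print(e)  # 'list' object is not callable

Prefer a descriptive name such as `urls` or `ids` instead.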

1 Answer


If you want to process the whole list concurrently, you should check asyncio.gather().

Example of usage:

import asyncio

from aiohttp import ClientSession, TCPConnector

# Cap concurrent connections per host so you don't flood the server;
# tune this to whatever the target site tolerates.
SESSION_LIMIT_PER_HOST = 10


async def your_async_call(session, title):
    # One request; gather() schedules many of these at once.
    async with session.get(f"https://example.com/{title}") as response:
        return await response.json()


async def asyncmain(your_list):
    async with ClientSession(
        connector=TCPConnector(limit_per_host=SESSION_LIMIT_PER_HOST)
    ) as session:
        tasks = (your_async_call(session, title) for title in your_list)
        return await asyncio.gather(*tasks)

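To see what gather() buys you without any network calls, here is a minimal self-contained sketch (the `work` coroutine is a stand-in for a request): all coroutines run concurrently on the event loop, and gather() returns their results in the same order as the inputs.

import asyncio


async def work(i):
    # Stand-in for an HTTP request: all of these sleeps overlap,
    # so 5 tasks finish in ~0.01 s total, not 5 * 0.01 s.
    await asyncio.sleep(0.01)
    return i * 2


async def main():
    # gather() schedules every coroutine at once and preserves input order.
    return await asyncio.gather(*(work(i) for i in range(5)))


results = asyncio.run(main())
print(results)  # [0, 2, 4, 6, 8]

For your case, call the real version with `asyncio.run(asyncmain(your_list))`.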

Leemosh