
I'd like to embed some async code in my Python project so that the HTTP request part can run asynchronously. For example, I read params from Kafka, use those params to generate some URLs, and put the URLs into a list. Once the length of the list is greater than 1000, I send the list to aiohttp to fetch the responses in a batch.

I cannot change the whole project from sync to async, so I can only change the HTTP request part.

The code example is:


import asyncio
import aiohttp

async def async_request(url):
    # Fetch one URL and return the decoded JSON body.
    async with aiohttp.ClientSession() as client:
        resp = await client.get(url)
        result = await resp.json()
        return result

async def do_batch_request(url_list, result):
    task_list = []
    for url in url_list:
        task = asyncio.create_task(async_request(url))
        task_list.append(task)
    # gather must be awaited; otherwise result would be extended with a
    # Future object instead of the actual responses.
    batch_response = await asyncio.gather(*task_list)
    result.extend(batch_response)

def batch_request(url_list):
    batch_response = []
    asyncio.run(do_batch_request(url_list, batch_response))
    return batch_response

url_list = []
for msg in kafka_consumer:
    url = msg['url']
    url_list.append(url)
    if len(url_list) >= 1000:
        batch_response = batch_request(url_list)
        parse(batch_response)
        ....

As we know, asyncio.run creates an event loop to run the async function and then closes the event loop. My question is: will this approach affect the performance of the async code? And do you have a better way for my situation?
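
For context, my understanding is that each call to asyncio.run does roughly the following (a simplified sketch that ignores details such as shutting down async generators):

loop = asyncio.new_event_loop()
try:
    loop.run_until_complete(do_batch_request(url_list, batch_response))
finally:
    loop.close()

so with one batch per 1000 URLs, a fresh loop is created and torn down for every batch.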

Kingname

1 Answer


There's no serious problem with your approach, and you'll get a speed benefit from asyncio. The only possible problem is that if you later want to do something async elsewhere in the code, you won't be able to run it concurrently with batch_request, since asyncio.run blocks the calling thread until the whole batch has finished.

There's not much else to do if you don't want to change the whole project from sync to async, but if in the future you want to run batch_request in parallel with something else, keep in mind that you can run it in a thread and wait for the result asynchronously.
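
For example, something like this (a minimal sketch; asyncio.to_thread needs Python 3.9+, on older versions loop.run_in_executor(None, batch_request, url_list) is the rough equivalent):

import asyncio

async def some_async_caller():
    # Hypothetical coroutine elsewhere in the project; batch_request,
    # url_list and parse are the names from the question.
    # Running the blocking batch_request in a worker thread lets other
    # coroutines keep running while the batch is in flight. The
    # asyncio.run inside batch_request is fine there, because the worker
    # thread has no running event loop of its own.
    batch_response = await asyncio.to_thread(batch_request, url_list)
    parse(batch_response)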

Mikhail Gerasimov