
In FastAPI, I have this route:

projects = {}
for id in ids:  # fetch project metadata for each id in the list
    p = await gitlab.project(id)
    if p and 'error' not in p:
        projects[int(id)] = p

But each request takes around 2 seconds, and since they run sequentially the whole route takes more than a minute.

What is the easiest way to run these in parallel, e.g. about 10 at a time as I would with a thread pool, without having to modify the gitlab.project(id) method?

gitlab.py has a global httpx.AsyncClient()
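
For context, the shared client in gitlab.py presumably looks roughly like this (the actual setup is not shown in the question, so this is an assumption):

import httpx

# gitlab.py (assumed): one global AsyncClient reused by project() and projects()
s = httpx.AsyncClient()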

I also tried sending the ids in a single call: res = await gitlab.projects(ids)

but it still runs everything sequentially, because projects() just awaits each project() call in a loop.

Below are the two functions in gitlab.py:

async def project(id: str):
    """ return metadata for a project """
    global s
    url = config.get_config()['gitlaburl'] + f"/{id}"
    r = await s.get(url, headers={'PRIVATE-TOKEN': config.get_config()['gitlabtoken']})
    if r.status_code != 200:
        return {"error": f"unable to fetch from gitlab: {url}:{r.status_code} -> {r.reason_phrase}"}
    out = {}
    out['id'] = int(id)
    dat = json.loads(r.text)
    for k,v in dat.items():
        if k in "description,name,path_with_namespace".split(','):
            out[k] = v
        if k=='namespace' and 'avatar_url' in v:
            out['avatar_url'] = v['avatar_url']
    return out

async def projects(ids:List[Union[str,int]]):
    """ array of projects from config projectids """
    dat = []
    for id in ids:
        dat.append(await project(id))
    return dat
    In addition to the link above, you might find [this answer](https://stackoverflow.com/a/74239367/17865804) and [this answer](https://stackoverflow.com/a/73736138/17865804) helpful as well. – Chris Mar 10 '23 at 18:30

1 Answer

You can do this in pure asyncio, no need for a thread pool. I would use asyncio.as_completed, which schedules the coroutines concurrently and yields them as each one finishes:

tasks = [gitlab.project(id_) for id_ in ids]
for t in asyncio.as_completed(tasks):
    p = await asyncio.gather(t)
    # process p, a one-element list holding the project dict
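
If you also want to cap how many requests run at once (the question mentions about 10 at a time), an asyncio.Semaphore does that without touching gitlab.project(). This is only a sketch; the fetch_all name and the limit of 10 are assumptions, not part of the original code:

import asyncio

async def fetch_all(ids, limit=10):
    """Fetch every project, with at most `limit` requests in flight."""
    sem = asyncio.Semaphore(limit)

    async def fetch_one(id_):
        async with sem:  # waits here while `limit` requests are already running
            return await gitlab.project(id_)

    results = await asyncio.gather(*(fetch_one(i) for i in ids))
    # keep the same dict shape the route builds: {id: project}
    return {p['id']: p for p in results if p and 'error' not in p}

The route could then replace its sequential loop with projects = await fetch_all(ids).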
  • Perfect direct answer. This worked fine; my job now runs in 5 seconds, down from 60 seconds. I just needed to add the asyncio.gather to demangle the coroutine task. – MortenB Mar 13 '23 at 10:31
  • great answer, helped me to decrease processing time, thanks! – MiFi Jun 02 '23 at 13:04