
I have a resource-intensive async method that I want to run as a background task. Example code for it looks like this:

@staticmethod
async def trigger_task(id: str, run_e2e: bool = False):
    try:
        add_status_for_task(id)
        result1, result2 = await task(id)
        update_status_for_task(id, result1, result2)
    except Exception:
        update_status_for_task(id, 'FAIL')


@router.post("/task")
async def trigger_task(background_tasks: BackgroundTasks):
    background_tasks.add_task(EventsTrigger.trigger_task)
    return {'msg': 'Task submitted!'}

When I trigger this endpoint, I expect an instant response: {'msg': 'Task submitted!'}. Instead, the API response is delayed until the task completes. I am following this documentation from FastAPI.

fastapi: v0.70.0 python: v3.8.10

I believe the issue is similar to what is described here. Requesting help in making this a non-blocking call.

udit
  • Does this answer your question? [fastapi asynchronous background tasks blocks other requests?](https://stackoverflow.com/questions/67599119/fastapi-asynchronous-background-tasks-blocks-other-requests) – mihi Dec 02 '21 at 11:15
  • Does this answer your question? [FastAPI runs api-calls in serial instead of parallel fashion](https://stackoverflow.com/questions/71516140/fastapi-runs-api-calls-in-serial-instead-of-parallel-fashion) – Chris Mar 10 '23 at 17:57

3 Answers


What I have learned from the GitHub issues:

  • Whether you declare the task function with async def or plain def changes how FastAPI runs it: an async def task is awaited on the server's event loop, while a plain def task is run in a threadpool.
  • Because an async def task shares the event loop with your request handlers, any blocking (non-awaited) call inside it stalls the whole server.
  • You can try rewriting the task without async/await so that it runs in the threadpool instead. If that also doesn't work, then you should go for an alternative.
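The effect described above can be reproduced with plain asyncio, independent of FastAPI. In this minimal sketch, time.sleep stands in for the resource-intensive work: run inline, it freezes the event loop and a concurrently scheduled coroutine stops making progress; offloaded to a thread (as a threadpool would do), the other coroutine keeps running:

```python
import asyncio
import time

async def side_work(events):
    # A lightweight coroutine that should keep running alongside the task,
    # like other request handlers on the same event loop.
    for _ in range(3):
        events.append("tick")
        await asyncio.sleep(0.01)

async def run(block):
    events = []

    async def task():
        if block:
            time.sleep(0.1)  # blocking call: freezes the whole event loop
        else:
            # Offload the blocking call to a thread, as a threadpool would.
            loop = asyncio.get_running_loop()
            await loop.run_in_executor(None, time.sleep, 0.1)
        events.append("task-done")

    await asyncio.gather(side_work(events), task())
    return events

blocked = asyncio.run(run(block=True))     # ['tick', 'task-done', 'tick', 'tick']
offloaded = asyncio.run(run(block=False))  # ['tick', 'tick', 'tick', 'task-done']
```

In the blocking run, side_work gets only one tick in before time.sleep monopolizes the loop; in the offloaded run, it finishes all its ticks while the work proceeds in a thread.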

Alternative Background Solution

  • Celery is a production-ready task queue, so you can configure it and run the background task with your_task_function.delay(*args, **kwargs).
  • Note that Celery task functions are also plain synchronous functions, so whatever you hand to it to run in the background needs to be sync code.

Good Luck :)


Unfortunately you seem to have oversimplified your example, so it is a little hard to tell what is going wrong.

The important question is: are add_status_for_task() and update_status_for_task() blocking? Because if they are (and it seems like that is the case), then you're going to have issues: when you run code with async/await, all the code inside it needs to be async as well.

This would make your code look more like:

async def trigger_task(id: str, run_e2e: bool = False):
    try:
        await add_status_for_task(id)
        result1, result2 = await task(id)
        await update_status_for_task(id, result1, result2)
    except Exception:
        await update_status_for_task(id, 'FAIL')


@router.post("/task/{task_id}")
async def trigger_task(task_id: str, background_tasks: BackgroundTasks):
    background_tasks.add_task(EventsTrigger.trigger_task, task_id)
    return {'msg': 'Task submitted!'}
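To see why the fully-async version returns instantly, here is a stdlib-only sketch. The STATUS dict and the task body are stand-ins for the question's helpers, and asyncio.create_task plays roughly the role BackgroundTasks plays after the response is sent:

```python
import asyncio

# In-memory stand-in for the question's status helpers (assumed to be
# fast, non-blocking updates once everything is async).
STATUS = {}

async def task(task_id):
    await asyncio.sleep(0.05)  # the resource-intensive work, fully async
    return "result1", "result2"

async def trigger_task(task_id):
    STATUS[task_id] = "PENDING"
    try:
        result1, result2 = await task(task_id)
        STATUS[task_id] = ("DONE", result1, result2)
    except Exception:
        STATUS[task_id] = "FAIL"

async def endpoint(task_id):
    # Schedule the coroutine without awaiting it and return immediately;
    # this is roughly what BackgroundTasks does after sending the response.
    asyncio.create_task(trigger_task(task_id))
    return {"msg": "Task submitted!"}

async def main():
    response = await endpoint("42")
    submitted_instantly = response == {"msg": "Task submitted!"}
    await asyncio.sleep(0)    # let the scheduled task start
    started = STATUS["42"] == "PENDING"
    await asyncio.sleep(0.1)  # let it finish
    finished = STATUS["42"] == ("DONE", "result1", "result2")
    return submitted_instantly, started, finished

print(asyncio.run(main()))  # (True, True, True)
```

The response comes back before the work starts, and the status moves from PENDING to DONE afterwards; this only holds if nothing inside trigger_task blocks the loop.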
acnebs

How are you running your app?

According to the uvicorn docs, it runs with 1 worker by default, which means only one process serves requests at a time. Try configuring your uvicorn to run with more workers: https://www.uvicorn.org/deployment/

$ uvicorn example:app --port 5000 --workers THE_AMOUNT_OF_WORKERS
or
uvicorn.run("example:app", host="127.0.0.1", port=5000, workers=THE_AMOUNT_OF_WORKERS)
  • This does not seem to be a correct approach, because with this, if I run 4 workers and have 20 users, then only 4 of them will be able to run their task at a time, because this is a blocking call. – udit Nov 29 '21 at 04:27
  • It should handle it fine if you are using async/await in Python. Check that the things you do support async work. I would recommend you to think about using Python Celery; my team uses it to run async tasks for a large number of users. Depending on your number of users and the amount of time required to complete the task, I would recommend scaling your service. – Yair Siman Tov Dec 01 '21 at 15:23