25

How can I properly utilize the asynchronous functionality in a FastAPI route?

The following code snippet takes 10 seconds to complete a call to my /home route, while I expect it to only take 5 seconds.

from fastapi import FastAPI
import time

app = FastAPI()

async def my_func_1():
    """
    my func 1
    """
    print('Func1 started..!!')
    time.sleep(5)
    print('Func1 ended..!!')

    return 'a..!!'

async def my_func_2():
    """
    my func 2
    """
    print('Func2 started..!!')
    time.sleep(5)
    print('Func2 ended..!!')

    return 'b..!!'

@app.get("/home")
async def root():
    """
    my home route
    """
    start = time.time()
    a = await my_func_1()
    b = await my_func_2()
    end = time.time()
    print('It took {} seconds to finish execution.'.format(round(end-start)))

    return {
        'a': a,
        'b': b
    }

I am getting the following result, which looks non-asynchronous:

λ uvicorn fapi_test:app --reload
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [5116]
INFO:     Started server process [7780]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:51862 - "GET / HTTP/1.1" 404
Func1 started..!!
Func1 ended..!!
Func2 started..!!
Func2 ended..!!
It took 10 seconds to finish execution.
INFO:     127.0.0.1:51868 - "GET /home HTTP/1.1" 200

But, I am expecting FastAPI to print like below:

Func1 started..!!
Func2 started..!!
Func1 ended..!!
Func2 ended..!!
It took 5 seconds to finish execution.

Please correct me if I am doing anything wrong.

Y A Prasad

3 Answers

21

Perhaps a bit late, and elaborating on Hedde's answer above, here is how your app code should look. Remember to await when sleeping, and to gather the awaitables: if you don't, the two tasks will not run concurrently, no matter whether you use time.sleep() or asyncio.sleep().

from fastapi import FastAPI
import time
import asyncio

app = FastAPI()

async def my_func_1():
    """
    my func 1
    """
    print('Func1 started..!!')
    await asyncio.sleep(5)
    print('Func1 ended..!!')

    return 'a..!!'

async def my_func_2():
    """
    my func 2
    """
    print('Func2 started..!!')
    await asyncio.sleep(5)
    print('Func2 ended..!!')

    return 'b..!!'

@app.get("/home")
async def root():
    """
    my home route
    """
    start = time.time()
    coros = [my_func_1(), my_func_2()]
    a, b = await asyncio.gather(*coros)
    end = time.time()
    print('It took {} seconds to finish execution.'.format(round(end-start)))

    return {
        'a': a,
        'b': b
    }
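
If the blocking call cannot be replaced with an awaitable (for example, a synchronous library function rather than asyncio.sleep), one option is to offload it to a thread and still gather the results. This is a minimal sketch, assuming Python 3.9+ for asyncio.to_thread; blocking_work is a hypothetical stand-in for your sync call:

```python
import asyncio
import time

def blocking_work(name):
    # Stands in for a synchronous call that cannot be awaited
    time.sleep(0.5)
    return name

async def main():
    start = time.time()
    # asyncio.to_thread runs each blocking call in a worker thread,
    # so gather can overlap them just like native coroutines
    a, b = await asyncio.gather(
        asyncio.to_thread(blocking_work, 'a..!!'),
        asyncio.to_thread(blocking_work, 'b..!!'),
    )
    elapsed = time.time() - start
    return a, b, elapsed

a, b, elapsed = asyncio.run(main())
print(a, b, round(elapsed, 1))  # both results after ~0.5s, not ~1s
```

The same pattern applies inside a FastAPI route: anything that blocks the event loop must either be awaited or pushed off the loop thread.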

Mattia Paterna
  • is there any example that we could use without using `asyncio.sleep(5)`? thanks. – fudu Jul 15 '22 at 03:10
  • You could try to have some remote calls to an external API instead. If the first call takes _n_ time to receive a response, and the second call takes _m_ time, then the total waiting time will be `max(n, m)`. – Mattia Paterna Jul 15 '22 at 12:20
  • Alternatively, you could try to change the input argument in `sleep` to see if the behaviour holds, e.g. whenever the longest concurrent task has been executed, `/home` will return. – Mattia Paterna Jul 15 '22 at 12:22
  • @y-a-prasad if the answer is useful, please consider accepting it. – Mattia Paterna Jun 13 '23 at 13:13
18

time.sleep is blocking; you should use asyncio.sleep. There are also asyncio.gather and asyncio.wait to aggregate jobs. This is well documented in both Python and FastAPI.
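
As a quick sketch of the asyncio.wait alternative mentioned here (it returns done/pending sets of Tasks rather than ordered results, unlike gather), with hypothetical job coroutines:

```python
import asyncio

async def job(name, secs):
    await asyncio.sleep(secs)
    return name

async def main():
    # asyncio.wait needs Task objects (bare coroutines are rejected in
    # newer Python versions), so wrap them with create_task first
    tasks = [asyncio.create_task(job('a', 0.1)),
             asyncio.create_task(job('b', 0.2))]
    done, pending = await asyncio.wait(tasks)  # waits for all by default
    return sorted(t.result() for t in done)

results = asyncio.run(main())
print(results)  # ['a', 'b']
```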

Hedde van der Heide
0

Chrome, at least, blocks concurrent GET requests to the same URL (probably so it gets a chance to use the cached version for the next one?).

Testing with one of the requests in an Incognito window should work, with `def` as well as with `async def`.

Robert Verdes