
Concurrency and parallel processing are two different things.

I know that FastAPI supports concurrency. It can handle multiple API requests concurrently using async and await.

What I want to know is whether FastAPI also supports multiprocessing and parallel processing of requests. If it does, how can I implement parallel processing of requests?

I have searched a lot, but everywhere I look I only find information about concurrency. I am new to FastAPI.

krishna
    Usually you'll ask your ASGI server of choice to run multiple workers (e.g. uvicorn or gunicorn); that way you get concurrency and can use async functionality inside a process, and can still run multiple instances in parallel. – MatsLindh Apr 19 '22 at 10:53

2 Answers


When running your app with Uvicorn or Gunicorn, you can specify how many workers/processes you want. In Uvicorn, you just need to pass --workers N as an argument, and in Gunicorn it's pretty much the same, with --workers=N. That will start N processes all receiving requests at the same time.
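For example (a minimal sketch, assuming your FastAPI app is an object named `app` in a file called `main.py`; adjust the module path and worker count to your setup):

```bash
# Uvicorn: start 4 worker processes
uvicorn main:app --workers 4

# Gunicorn: same idea, using Uvicorn's worker class to serve an ASGI app
gunicorn main:app --workers=4 --worker-class uvicorn.workers.UvicornWorker
```

Each worker is a separate process with its own event loop, so requests can be processed in parallel across CPU cores, while each process still handles its own requests concurrently via async.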

Tonik
If you want to handle multiple requests at once, you can define your routes with def instead of async def. FastAPI will then run each request in a separate thread from its threadpool, so multiple requests can be handled at the same time.
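For illustration, a minimal sketch of the difference (the route paths and the `time.sleep` call are just placeholders for blocking work):

```python
import time
from fastapi import FastAPI

app = FastAPI()

# Plain `def` route: FastAPI runs it in a threadpool, so while one request
# is stuck in the blocking call, other requests can still be served.
@app.get("/blocking-def")
def blocking_def():
    time.sleep(5)  # placeholder for blocking I/O or a slow library call
    return {"handled_in": "threadpool"}

# `async def` route with a blocking call: this blocks the event loop,
# so other requests wait until it finishes.
@app.get("/blocking-async")
async def blocking_async():
    time.sleep(5)  # blocking inside async code stalls the whole loop
    return {"handled_in": "event loop"}
```

Note that threads give you concurrency for blocking work within a single process; true parallel execution across CPU cores still comes from running multiple worker processes, as described in the other answer.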

  • Answer seems incorrect – youri Apr 22 '23 at 10:29
  • @youri and why is that? – Ammar Ahmad Khan Jun 08 '23 at 10:03
  • I think I misread. I can't upvote unless the answer is edited for some reason. Maybe you could add the link found in the banner above, quoting the relevant part as a summary? This one: [FastAPI runs api-calls in serial instead of parallel fashion](https://stackoverflow.com/questions/71516140/fastapi-runs-api-calls-in-serial-instead-of-parallel-fashion) – youri Jul 25 '23 at 08:05