
I am working on a FastAPI application where I need to handle large file uploads. I want to be able to read the file data as a stream and process it in chunks, without using temporary files or reading the entire file into memory. Is there a way to achieve this in FastAPI?

I have a FastAPI endpoint that accepts file uploads using the multipart/form-data format. I want to be able to access the file data as a stream and read it in chunks, rather than loading the entire file into memory. This is important for handling large files efficiently and preventing memory issues.

I have looked into using FastAPI's UploadFile, but it buffers the incoming data into memory/disk via tempfile.SpooledTemporaryFile. I would like to avoid this behavior and read the file as a stream directly from the client's HTTP request.

Is there a way to treat the file data as a stream or use a streaming approach in FastAPI to handle large file uploads? If achieving this behavior with FastAPI is not possible, I am willing to consider other frameworks. Any examples or suggestions would be greatly appreciated.

  • Please have a look at [this answer](https://stackoverflow.com/a/73443824/17865804) as well. – Chris Jun 11 '23 at 09:14
  • In essence, there is no other way to read the request body stream in chunks and parse it except by doing it manually yourself or using a framework that provides such functionality? – LtGenFlower Jun 12 '23 at 08:26

0 Answers