
I would like to allow my users to upload large files (under 1 GB) to my database. I am using a database because storing raw files can be dangerous, and I would like to have a single source of state in my system, since it is meant to be serverless.
Now, the VPS I am planning to run it on has limited RAM, and multiple users should of course be able to upload simultaneously.
So, in order not to exceed this RAM, I would need to either:

  • stream the image into the database as it is being uploaded by the user,
  • or first stream it into a file using something like multer, and then stream it from that file into PostgreSQL as a BLOB (a sketch of this second option follows the list).
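
For illustration, here is a minimal sketch of that second option, assuming an Express app, a hypothetical table `files(name text, data bytea)`, and a placeholder connection string. Note the catch it exposes: multer streams the upload to disk without buffering it, but the subsequent `INSERT` still needs the whole file as a single `Buffer`, which is exactly the RAM problem in question.

```js
const express = require('express');
const multer = require('multer');
const fs = require('fs/promises');
const pgp = require('pg-promise')();

const db = pgp('postgres://user:pass@localhost:5432/mydb'); // placeholder
const upload = multer({ dest: '/tmp/uploads' }); // multer streams the upload to disk
const app = express();

app.post('/upload', upload.single('file'), async (req, res) => {
  try {
    // The catch: reading the temp file back yields one Buffer,
    // so the entire file sits in RAM for the duration of the INSERT.
    const data = await fs.readFile(req.file.path);
    await db.none('INSERT INTO files(name, data) VALUES($1, $2)', [
      req.file.originalname,
      data, // pg-promise/node-postgres serialize a Buffer as bytea
    ]);
    res.sendStatus(201);
  } catch (err) {
    res.status(500).send(err.message);
  } finally {
    if (req.file) await fs.unlink(req.file.path).catch(() => {}); // drop the temp file
  }
});

app.listen(3000);
```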

So, is there a way to do this using pg-promise: stream a file into the database without ever loading the whole thing into RAM?

user2741831
  • There is no such thing as streaming into an individual `BLOB`, not in `pg-promise`, not in any other library, because PostgreSQL doesn't have such a feature. You are effectively misusing the database. You just do not put large files into PostgreSQL itself; that's the wrong thinking. – vitaly-t Jul 15 '21 at 13:01
  • See also [this question](https://stackoverflow.com/questions/9605922/are-there-performance-issues-storing-files-in-postgresql). – vitaly-t Jul 15 '21 at 13:13
  • I mean, fundamentally a database is for holding data. I don't understand why everyone seems to think that larger units of data belong in a different storage system; it's just one more thing to manage. But I guess I'll just stick with a filesystem solution for now, thanks. – user2741831 Jul 15 '21 at 15:07
  • Only external file storage will let you actually stream files in, so it will scale a heck of a lot better. – vitaly-t Jul 16 '21 at 00:52 (a sketch of this approach follows the comments)
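
To make the comments concrete, here is a minimal sketch of the filesystem approach, assuming a hypothetical metadata table `files(id serial primary key, name text, path text)` and placeholder paths and credentials. multer's disk storage writes each upload to disk chunk by chunk, so memory use stays flat regardless of file size or the number of concurrent uploads, and PostgreSQL holds only the metadata.

```js
const express = require('express');
const multer = require('multer');
const pgp = require('pg-promise')();

const db = pgp('postgres://user:pass@localhost:5432/mydb'); // placeholder
const upload = multer({
  dest: '/var/data/uploads',       // uploads are streamed here, never buffered whole in RAM
  limits: { fileSize: 1024 ** 3 }, // enforce the 1 GB cap per upload
});
const app = express();

app.post('/upload', upload.single('file'), async (req, res) => {
  try {
    // Only the metadata goes into PostgreSQL; the bytes stay on disk.
    const row = await db.one(
      'INSERT INTO files(name, path) VALUES($1, $2) RETURNING id',
      [req.file.originalname, req.file.path]
    );
    res.status(201).json({ id: row.id });
  } catch (err) {
    res.status(500).send(err.message);
  }
});

app.listen(3000);
```

The database then remains the single source of truth about which files exist, while the bytes themselves live where they can actually be streamed.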

0 Answers