
Context

I have a Django project containing a database model with a field of type `models.BinaryField(max_length=50000000, default=b'0')`.

Problem

When the server is asked to serve this binary data, I hit the RAM limits of my server even though there are never more than 1-2 concurrent requests. I read the data using io.BytesIO together with stream.seek(index) and stream.close() (I use the term "stream" loosely here; I am simply seeking within and closing an in-memory buffer of bytes).
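For reference, the access pattern described above looks roughly like this (the names are illustrative; the original code is not shown in the question). The key point is that the ORM loads the entire BinaryField value into RAM before BytesIO ever sees it; seek() only moves a cursor within that in-memory copy, so it does not reduce memory usage:

```python
import io

def read_slice(blob: bytes, start: int, length: int) -> bytes:
    # `blob` is already fully in memory by the time it arrives here --
    # Django materializes the whole BinaryField when the row is queried.
    stream = io.BytesIO(blob)
    stream.seek(start)   # moves a cursor; does not avoid the full load
    data = stream.read(length)
    stream.close()
    return data

blob = b"x" * 1_000_000  # stand-in for a large BinaryField value
print(read_slice(blob, 10, 5))
```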

Question

Do I need to store this binary data differently?

Solutions that I have tried that didn't work

I wondered whether splitting the data into smaller chunks would be easier on the server's RAM. When I attempted this, I found that chunking the file into smaller pieces and relating them to the parent record with a many-to-one relationship added a great deal of overhead on the database side.
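The chunking attempt described above can be sketched as follows (the chunk size is an assumption; the question does not state one). Each chunk would become its own row related to the parent via a ForeignKey, which is where the database overhead comes from: a 50 MB blob at 1 MB per chunk means 50 extra rows plus per-row and index overhead on every query:

```python
def split_chunks(data: bytes, chunk_size: int = 1_000_000) -> list[bytes]:
    # Slice the blob into fixed-size pieces; the last piece may be shorter.
    # In the many-to-one scheme, each element becomes one child row.
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

parts = split_chunks(b"a" * 2_500_000)
print(len(parts), len(parts[-1]))  # 3 500000
```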

djvg
Ozzie
    Please *don't* save binary data in the database. It will result in a lot of memory usage (and run out of memory if the stream is larger than the binary blob) and will be inefficient due to the need to escape the binary sequence. For large binary data, the file system should be used. You can use a `FileField` to abstract away that logic. – Willem Van Onsem Nov 18 '21 at 17:55
  • @WillemVanOnsem are you aware of any implementations of this for Django, and Heroku that don't use H2? – Ozzie Nov 18 '21 at 17:58
  • related: https://dba.stackexchange.com/questions/2445/should-binary-files-be-stored-in-the-database, https://stackoverflow.com/q/38120895 – djvg Aug 02 '22 at 13:09
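Expanding on the FileField suggestion in the comments: with a field like `models.FileField(upload_to="documents/")`, the database stores only a path and the bytes live on the storage backend, so queries never pull the blob into RAM (on Heroku's ephemeral filesystem this would need an external backend such as S3 via django-storages; that backend choice is an assumption, not from the thread). Serving then streams the file in fixed-size blocks, e.g. via `django.http.FileResponse(doc.payload.open("rb"))`. A minimal pure-Python sketch of that chunked streaming, with illustrative names:

```python
import os
import tempfile

CHUNK = 64 * 1024  # illustrative block size

def iter_file(path, chunk_size=CHUNK):
    # Yield the file in fixed-size blocks so at most one block is held in
    # RAM at a time -- essentially what Django's FileResponse does when
    # handed an open file object backed by a FileField.
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk_size)
            if not block:
                break
            yield block

# demo: write 1 MiB, then stream it back in 64 KiB pieces
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"x" * 1_048_576)
chunks = list(iter_file(tmp.name))
os.unlink(tmp.name)
print(len(chunks))  # 16
```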

0 Answers