I am using two dynos (web and worker). The web dyno handles requests using a Flask app. The worker runs basic Python code which outputs a .csv file every 5 minutes. This file is quite small (<1MB). The Flask app is supposed to read this .csv file, which is required to serve the requests. My question is: what is the most efficient way to do this?

From what I understand, dynos are isolated from each other. Secondly, Heroku has an ephemeral filesystem, which is OK for my application because the .csv files need not persist between restarts, and they need not be backed up. The .csv file written out by the worker is not visible in the `ls` output (after doing `heroku run bash`). This is most probably because the web and worker dynos are isolated from each other.
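For context, the worker's loop is roughly like this (heavily simplified; `generate_csv` and the file name are placeholders for the real logic):

```python
import time

def generate_csv() -> str:
    # Placeholder for the actual logic that produces the CSV contents
    return "col_a,col_b\n1,2\n"

while True:
    # Written to the worker dyno's own (ephemeral, isolated) filesystem,
    # which is presumably why the web dyno never sees it
    with open("output.csv", "w") as f:
        f.write(generate_csv())
    time.sleep(300)  # every 5 minutes
```

After spending some time researching the options online, I think there are three options: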
(1) Use AWS S3: Is this a good option for active files? The files are supposed to be written and read at high frequency; wouldn't S3 be slow for my application? Secondly, I want to write (and read back) from within the Python code, and I am not sure how to do this with S3. It seems that S3 is meant as storage for static files that your application needs.
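From the boto3 docs, it looks like the S3 approach would be something like this (the bucket and key names are made up, and this assumes boto3 is installed and AWS credentials are configured), but I am not sure this is the right pattern for a file that changes every few minutes:

```python
import boto3  # assumes boto3 is installed and AWS credentials are configured

s3 = boto3.client("s3")
BUCKET = "my-app-bucket"  # hypothetical bucket name
KEY = "latest.csv"        # hypothetical object key

def upload_csv(csv_text: str) -> None:
    # Worker side: overwrite the object with the freshly generated CSV
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=csv_text.encode("utf-8"))

def download_csv() -> str:
    # Web side: fetch the latest CSV when a request comes in
    obj = s3.get_object(Bucket=BUCKET, Key=KEY)
    return obj["Body"].read().decode("utf-8")
```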
(2) Postgres: Do the worker and web dynos share this storage? Can one write a .csv file to it, or does it have to be SQL?
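If both dynos can attach the same Heroku Postgres add-on via DATABASE_URL, I imagine the whole file could be stored as text in a one-row table, something like this (the latest_csv table is made up and assumed to already exist; this uses psycopg2):

```python
import os

import psycopg2  # assumes psycopg2 is installed and DATABASE_URL is set by the add-on

def save_csv(csv_text: str) -> None:
    # Worker side: replace the single row with the latest CSV contents
    with psycopg2.connect(os.environ["DATABASE_URL"]) as conn:
        with conn.cursor() as cur:
            cur.execute("DELETE FROM latest_csv")
            cur.execute("INSERT INTO latest_csv (contents) VALUES (%s)", (csv_text,))

def load_csv() -> str:
    # Web side: read the latest CSV contents back on demand
    with psycopg2.connect(os.environ["DATABASE_URL"]) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT contents FROM latest_csv LIMIT 1")
            return cur.fetchone()[0]
```

Since the file is under 1MB, storing it whole in one row seems feasible, but I do not know if this is considered a misuse of Postgres.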
(3) Redis: I do not understand this one. It is something which uses queues to communicate between the dynos. Can it be used to communicate data? My Flask application has to read the file only when a request comes in from the front end, and I am not sure how a Redis queue can help with this.
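Or is the idea to use Redis as a plain key-value store rather than a queue? Something like this, maybe (the key name is made up; this assumes the redis-py package and a REDIS_URL set by a Heroku Redis add-on):

```python
import os

import redis  # assumes the redis-py package is installed

r = redis.from_url(os.environ["REDIS_URL"])  # REDIS_URL set by the add-on

def publish_csv(csv_text: str) -> None:
    # Worker side: overwrite a single key with the latest CSV every 5 minutes
    r.set("latest_csv", csv_text)

def fetch_csv():
    # Web side: read the key only when a request comes in
    raw = r.get("latest_csv")  # bytes, or None if the worker has not written yet
    return raw.decode("utf-8") if raw is not None else None
```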
Thanks.