I want to make a web page that generates a uniform random number between 0 and 99 every 10 seconds, and displays a list of the 100 most recent numbers (which are the same for everyone visiting the site). It should update live.
My design is the following:
- A long-running Python process (e.g. managed by supervisord) that runs an infinite loop, generating a number every 10 seconds, writing it to a file or SQL database, and pruning the old numbers since they are no longer needed.
- The web server process then simply reads the file and displays it to the user (either on initial load, or from an Ajax call to fetch the most recent numbers).
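To make the design concrete, here is a minimal sketch of the generator process I have in mind (the file path is just a placeholder; I write via a temp file plus rename so the web server never reads a half-written file):

```python
import json
import os
import random
import time

STATE_FILE = "/tmp/numbers.json"  # placeholder path
MAX_NUMBERS = 100
INTERVAL = 10  # seconds between numbers

def step(numbers):
    """Append one fresh number and prune to the most recent MAX_NUMBERS."""
    numbers.append(random.randint(0, 99))
    return numbers[-MAX_NUMBERS:]

def write_state(numbers, path=STATE_FILE):
    """Write atomically: dump to a temp file, then rename over the old one,
    so a concurrent reader always sees a complete JSON document."""
    tmp = path + ".tmp"
    with open(tmp, "w") as f:
        json.dump(numbers, f)
    os.replace(tmp, path)

def run():
    numbers = []
    while True:
        numbers = step(numbers)
        write_state(numbers)
        time.sleep(INTERVAL)
```

The rename trick at least removes torn reads from the picture, even if the overall I/O concern remains.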
I don't feel great about this solution. It's fairly heavy on file-system I/O, which isn't actually a bottleneck, but I wonder if there's a smarter way that is still simple. If I could store the list as an in-memory data structure shared between processes, one process could push and pop values every 10 seconds, and the web server processes could just read that structure. I read a bit about Unix domain sockets, but it wasn't clear that they were a good fit for my problem.
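For what it's worth, here is roughly what I pictured for the shared-memory variant, using the stdlib `multiprocessing.shared_memory` module (Python 3.8+). The block name `SHM_NAME` is just a hypothetical label that both processes would have to agree on:

```python
import random
from multiprocessing import shared_memory

SHM_NAME = "recent_numbers"  # hypothetical name both processes agree on
SIZE = 100

def create_list(name=SHM_NAME):
    """Writer side: allocate a shared list of SIZE slots, updated in place."""
    return shared_memory.ShareableList([0] * SIZE, name=name)

def push(numbers, value):
    """Shift every slot down by one and put the newest value at the end."""
    for i in range(len(numbers) - 1):
        numbers[i] = numbers[i + 1]
    numbers[-1] = value

def snapshot(name=SHM_NAME):
    """Reader side (web worker): attach to the existing block by name."""
    numbers = shared_memory.ShareableList(name=name)
    try:
        return list(numbers)
    finally:
        numbers.shm.close()
```

The writer would call `push` with a fresh `random.randint(0, 99)` every 10 seconds. Note there is no locking here, so in principle a reader could observe a half-shifted list; that may or may not matter for this use case.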
Is there a more efficient approach that is still simple?
EDIT: The approach suggested by Martijn Peters in his answer (don't generate anything until someone visits) is sensible, and I am considering it too, since the website doesn't get very heavy traffic. The problem I see is race conditions, since you then have multiple processes trying to write to the same file/DB. If the values in the file/DB are stale, we need to generate new ones, but one process might read the old values before another has had a chance to update them. File locking as described in this question is a possibility, but many people in the answers warn against having multiple processes write to the same file.
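For the lazy variant, here is a sketch of what each web process could do, using `fcntl.flock` (Unix-only) so that only one process at a time performs the stale check and regeneration; the path and JSON field names are placeholders I made up:

```python
import fcntl
import json
import random
import time

STATE_FILE = "/tmp/numbers.json"  # placeholder path
INTERVAL = 10       # seconds between numbers
MAX_NUMBERS = 100

def get_numbers(now=None, state_file=STATE_FILE):
    """Return the current list, generating any missed numbers first.
    Holding an exclusive lock makes the read-check-update sequence
    atomic across processes, which avoids the race described above."""
    now = time.time() if now is None else now
    with open(state_file + ".lock", "w") as lock:
        fcntl.flock(lock, fcntl.LOCK_EX)  # blocks until we hold the lock
        try:
            with open(state_file) as f:
                state = json.load(f)
        except (FileNotFoundError, ValueError):
            state = {"updated": 0, "numbers": []}
        # How many 10-second ticks have elapsed since the last update?
        missed = int((now - state["updated"]) // INTERVAL)
        if missed:
            fresh = [random.randint(0, 99) for _ in range(min(missed, MAX_NUMBERS))]
            state["numbers"] = (state["numbers"] + fresh)[-MAX_NUMBERS:]
            # advance the timestamp by whole ticks so the schedule stays aligned
            state["updated"] += missed * INTERVAL
            with open(state_file, "w") as f:
                json.dump(state, f)
        # the lock is released when the file object is closed
        return state["numbers"]
```

Since every writer goes through the same lock, the warnings about multiple uncoordinated writers shouldn't apply, though I'd still want a second opinion on whether `flock` behaves well under my web server's process model.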