I'm developing a backend API service using Python Falcon. To serve each API call, I need a single shared object (the same object for all requests) which is refreshed every X hours (let's say 1 hour). Currently this object is stored as a pickle in S3. The problem is that the pickle is quite big (~20 MB), so reading and unpickling it from S3 on every API call seems inefficient. However, I'm not sure what the right approach is to keep this object in memory between different API calls, and how to refresh it every X hours. Because the pickle is relatively big, I don't want to store it on local disk, and would prefer an in-memory shared-object approach.
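To make the idea concrete, here is a minimal sketch of the in-process TTL-cache pattern I'm considering (the `loader` callable and names are placeholders; in my real code it would be boto3 fetching the S3 object and `pickle.loads` deserializing it). Is something like this reasonable, or is there a better-established approach?

```python
import threading
import time


class CachedObject:
    """Thread-safe wrapper that caches the result of an expensive loader
    (e.g. unpickling a ~20 MB object from S3) and reloads it after a TTL."""

    def __init__(self, loader, ttl_seconds=3600):
        self._loader = loader          # callable returning the fresh object
        self._ttl = ttl_seconds
        self._lock = threading.Lock()
        self._value = None
        self._loaded_at = 0.0

    def get(self):
        # Fast path: cached value is still fresh.
        if self._value is not None and time.time() - self._loaded_at <= self._ttl:
            return self._value
        with self._lock:
            # Re-check under the lock so only one thread reloads.
            if self._value is None or time.time() - self._loaded_at > self._ttl:
                self._value = self._loader()
                self._loaded_at = time.time()
        return self._value


# Hypothetical usage inside a Falcon resource; s3_client and bucket/key
# are assumptions, not my actual setup:
#
#   shared = CachedObject(
#       lambda: pickle.loads(
#           s3_client.get_object(Bucket="my-bucket", Key="model.pkl")["Body"].read()
#       ),
#       ttl_seconds=3600,
#   )
#   ...
#   obj = shared.get()  # same cached object for all requests within the TTL
```

My concern is that this only shares the object per worker process, so with multiple Gunicorn workers each one would hold its own 20 MB copy and do its own refresh.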
Thx, Oren