I would like to serve the results of memory- and CPU-intensive aggregations computed over a domain-specific in-memory matrix format implemented in Python. Loading the data into memory takes some time, so ideally the data would be cached in memory so that successive aggregations are quick. Is there an in-memory Python caching solution that would work with Flask and celery?

I know I could implement this myself using multiprocessing.Manager for small non-serializable data, as described in Store large data or a service connection per Flask session, but my question is whether this is possible using existing functionality of celery or some other package.
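For reference, a minimal sketch of the multiprocessing.Manager approach mentioned above (class names and the placeholder matrix are hypothetical; a real version would load the actual matrix format in `MatrixStore.__init__`):

```python
from multiprocessing.managers import BaseManager

class MatrixStore:
    """Holds the expensively loaded matrix in one long-lived process."""
    def __init__(self):
        # Placeholder for the real, slow load of the in-memory matrix.
        self._matrix = [[1, 2], [3, 4]]

    def row_sum(self, i):
        # Aggregations run inside the manager process, so the large
        # matrix itself is never pickled back to the callers; only the
        # small result crosses the process boundary.
        return sum(self._matrix[i])

class MatrixManager(BaseManager):
    pass

# Each call to manager.store() constructs a new MatrixStore in the
# manager process; to share one instance, register a callable that
# returns a singleton instead.
MatrixManager.register("store", callable=MatrixStore)

if __name__ == "__main__":
    manager = MatrixManager()
    manager.start()            # spawns the process that holds the data
    store = manager.store()    # proxy usable from Flask views or tasks
    print(store.row_sum(0))    # -> 3
    manager.shutdown()
```

The drawback, as the question implies, is that you end up maintaining this server process and proxy plumbing yourself.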

andrew

1 Answer

As it turns out, celery ships with an LRU cache implementation that can easily be used to cache large resources shared across tasks. I found this example quite useful: https://merin-rose-tom.medium.com/caching-in-celery-using-lrucache-357053251e96
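A minimal sketch of the pattern, using the stdlib functools.lru_cache as a stand-in so it runs without a broker (function names, the path argument, and the placeholder matrix are all hypothetical; in a real celery app `aggregate` would be decorated with `@app.task` and the LRU cache from celery/kombu could be used instead):

```python
from functools import lru_cache

@lru_cache(maxsize=4)
def load_matrix(path):
    # Stand-in for the expensive, memory-intensive load of the
    # domain-specific matrix format. The load runs once per worker
    # process; later tasks with the same path reuse the cached object.
    return [[1, 2], [3, 4]]  # placeholder matrix

def aggregate(path):
    # In a celery app this would be the task body. Because the cache
    # is module-level, it is shared by all tasks executed within the
    # same worker process.
    matrix = load_matrix(path)
    return sum(sum(row) for row in matrix)
```

Note that with the default prefork pool, each worker process holds its own copy of the cache, so the matrix is loaded once per process rather than once per machine.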

andrew