
I would like to serve the results of memory- and CPU-intensive aggregations calculated on a domain-specific in-memory matrix format implemented in Python. Loading the data into memory takes some time, so ideally the data would be cached in memory so that successive aggregations are quick. Is there an in-memory Python caching solution that would ideally work with Flask and Celery?

I know I could implement this myself using multiprocessing.Manager for small non-serializable data, as described in Store large data or a service connection per Flask session, but my question is whether this is possible using existing functionality of Celery or some other package.
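As a point of comparison, a per-worker in-memory cache can be sketched with the standard library's functools.lru_cache, which keeps the expensive load inside each worker process. This is only a sketch: load_matrix and aggregate are hypothetical stand-ins for the question's domain-specific loading and aggregation logic, and the cache is not shared across processes.

```python
from functools import lru_cache

@lru_cache(maxsize=4)  # keep up to 4 loaded matrices per worker process
def load_matrix(dataset_id):
    # Stand-in for the expensive domain-specific load:
    # here just a 100x100 multiplication-table matrix.
    return [[i * j for j in range(100)] for i in range(100)]

def aggregate(dataset_id):
    matrix = load_matrix(dataset_id)  # cache hit after the first call
    return sum(sum(row) for row in matrix)

print(aggregate("sales"))                 # first call loads the matrix
print(aggregate("sales"))                 # second call reuses the cache
print(load_matrix.cache_info().hits)      # 1 hit after the two calls
```

Each Celery worker process would hold its own copy of this cache, so repeated tasks routed to the same worker avoid reloading, but nothing is shared between workers.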

andrew
  • No, Celery does not provide this functionality. We use Redis for this purpose. – DejanLekic Jul 13 '22 at 10:15
  • 1
    I think you are mistaken; Celery has implemented LRUCache for this purpose. I couldn't answer this question, so I just reposted here: https://stackoverflow.com/questions/72997899/caching-for-large-data-queried-via-flask-and-celery/72997900#72997900 – andrew Jul 15 '22 at 17:47
  • 1
    celery.utils.LRUCache is actually kombu.utils.LRUCache - I briefly looked at the code and I am fairly certain that the cache is not distributed. So it worked for the author because he had a single worker... Also, if you need local (single-process) caching then it is fine, but then I would recommend cachetools instead. – DejanLekic Jul 15 '22 at 18:19
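To illustrate the point made in the comments, the LRU caches being discussed (kombu's, or the ones in cachetools) are plain per-process data structures. The following minimal sketch, similar in spirit to those but not their actual code, shows the eviction behaviour using only the standard library:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal per-process LRU cache, similar in spirit to the
    caches in kombu or cachetools (not their actual implementation)."""

    def __init__(self, limit=None):
        self.limit = limit
        self.data = OrderedDict()

    def __setitem__(self, key, value):
        self.data[key] = value
        self.data.move_to_end(key)
        if self.limit is not None and len(self.data) > self.limit:
            self.data.popitem(last=False)  # evict least recently used

    def __getitem__(self, key):
        value = self.data[key]
        self.data.move_to_end(key)  # mark as recently used
        return value

    def __contains__(self, key):
        return key in self.data

cache = LRUCache(limit=2)
cache["a"] = 1
cache["b"] = 2
_ = cache["a"]   # touch "a" so "b" becomes least recently used
cache["c"] = 3   # exceeds the limit: evicts "b"
print("a" in cache, "b" in cache, "c" in cache)  # True False True
```

Because each worker process holds its own instance, this kind of cache only helps when requests for the same data land on the same worker, which is why the comments suggest Redis for a truly shared cache.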

0 Answers