
My final goal is a two-layer cache for any function I choose (probably a self-implemented decorator is needed). I have multiple VMs running the same Django server. The first cache layer is in-memory, and the second layer is a Redis instance shared between the VMs. The process works as follows: a function is decorated for two-layer caching. When the function is called, the server first looks for the result in its in-memory cache; if it can't find it there, it checks the shared Redis.

How can I achieve this?

I already have this code snippet:

from django.conf import settings
from cachetools.func import ttl_cache
from cache_memoize import cache_memoize

@ttl_cache(maxsize=settings.A_NUMBER, ttl=settings.CACHE_TIMEOUT)
@cache_memoize(settings.CACHE_TIMEOUT)
def my_func(arg1, arg2):
    ...  # some logic here

Django settings:

CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': env.str('REDIS_MASTER'),
    },
}

I read this (How to use 2 different cache backends in Django?), but I don't know whether it's possible to use the two backends as decorators.
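For reference, registering two backends side by side (the approach the linked question discusses) would look roughly like this in settings. This is a hedged sketch: the `'local'` alias and the `LocMemCache` backend choice are illustrative assumptions, not part of the original snippet.

```python
# Sketch: two named cache backends in Django settings.
# 'default' mirrors the original Redis config; 'local' is an
# assumed alias for an illustrative per-process in-memory backend.
CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': env.str('REDIS_MASTER'),
    },
    'local': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
    },
}
```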

Thanks!

Moein

1 Answer


You've used the right pieces, but the decorator order is wrong: as written, your code queries Redis first, then the local cache, and only then calls the function.

from django.conf import settings
from cachetools.func import ttl_cache
from cache_memoize import cache_memoize


@cache_memoize(settings.CACHE_TIMEOUT)
@ttl_cache(maxsize=settings.A_NUMBER, ttl=settings.CACHE_TIMEOUT)
def my_func(arg1, arg2):
    ...  # some logic here

Now a call checks the TTL cache first; on a hit, that value is returned immediately. On a miss, cache_memoize looks in Redis; on a Redis hit, that value is returned. Only when both layers miss is the function itself executed.
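If you do end up writing the self-implemented decorator the question mentions, the lookup order above can be sketched without Django or Redis at all. This is a minimal illustration, not production code: plain dicts stand in for the per-process cache and the shared Redis, and the `two_layer_cache` name and its parameters are assumptions for the example.

```python
import functools
import time


def two_layer_cache(local_cache, shared_cache, ttl):
    """Sketch of a two-layer caching decorator.

    local_cache: dict standing in for the per-process memory layer
    shared_cache: dict standing in for the Redis layer shared by VMs
    ttl: seconds before a local entry is considered stale
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args):
            key = (func.__name__,) + args
            now = time.monotonic()
            # Layer 1: per-process memory, honoring the TTL.
            entry = local_cache.get(key)
            if entry is not None:
                value, expires_at = entry
                if now < expires_at:
                    return value
            # Layer 2: shared cache; on a hit, warm the local layer.
            if key in shared_cache:
                value = shared_cache[key]
                local_cache[key] = (value, now + ttl)
                return value
            # Miss in both layers: compute once, populate both.
            value = func(*args)
            shared_cache[key] = value
            local_cache[key] = (value, now + ttl)
            return value
        return wrapper
    return decorator
```

With a real backend you would replace the `shared_cache` dict operations with calls to Django's cache API (e.g. `caches['default'].get`/`.set`), but the control flow stays the same: local hit, else shared hit, else compute.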

sonus21