
I have an object with a large number of recursive/iterative methods. I'd like to cache the method responses to speed up calls. The four ways I've found of doing this are:

  1. Use the built-in lru_cache. This creates a shared cache at the class level. Benefits are that it is fast, easy to implement, and familiar to other people. The main issue is that I don't see a way to clear the cache entries associated with a specific instance, so if there are many instances the cache becomes very large (and holds references to those instances, keeping them alive), which makes this approach unusable in this case.
  2. Create my own version of caching at the class level that allows cached items for a specific instance to be purged when it is destroyed, like in this response here.
  3. Store results at the instance level (either with lru_cache as shown in the solution here or with descriptors & instance dictionary).
  4. Store results at the method level: on the first __get__, return a new callable that holds the bound method and its cache, and assign it to the instance under the method name.

Example:

import types

class Wrapper:
    
    def __init__(self, func):
        self._values = {}
        self.func = func
    
    def __call__(self, *args):
        # Positional arguments only; keyword arguments are not supported.
        key = tuple(args)
        if key not in self._values:
            self._values[key] = self.func(*args)
        return self._values[key]

class Memoizer:
    
    def __init__(self, func):
        self.func = func
    
    def __set_name__(self, owner, name):
        self.name = name
        
    def __get__(self, instance, cls):
        if instance is None:
            # Accessed on the class itself; return the descriptor.
            return self
        # Replace the descriptor with a per-instance Wrapper on first
        # access, so later lookups bypass __get__ entirely.
        bound_func = Wrapper(types.MethodType(self.func, instance))
        setattr(instance, self.name, bound_func)
        return bound_func

class MyClass:
    
    @Memoizer
    def my_func(self, x, y):
        return x + y

I'd prefer not to reinvent the wheel. The lru_cache version of #3 seems like the most straightforward.
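For context, here is a minimal sketch of the #3 variant I mean, where the lru_cache-wrapped bound method is attached in __init__ so the cache lives and dies with the instance (the class and method names are just for illustration; note the cache and the instance form a reference cycle, which the cyclic garbage collector can still reclaim):

```python
import functools

class Calculator:
    def __init__(self):
        # Wrap the bound method so the lru_cache is stored on this
        # instance; each instance gets its own independent cache.
        self.add = functools.lru_cache(maxsize=None)(self._add)

    def _add(self, x, y):
        return x + y
```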

Is there a convention for caching methods and having those cached results purged when the instance is deleted?
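For completeness, my understanding of approach #2 is something like the sketch below: a class-level cache keyed by weak references to instances, so an instance's entries disappear automatically when it is garbage-collected (memoize_method is my own name, not from the linked answer, and it assumes instances are weak-referenceable, i.e. no __slots__ without __weakref__):

```python
import functools
import weakref

def memoize_method(func):
    # One shared WeakKeyDictionary per decorated method, keyed by
    # instance. When an instance dies, its entry (and all of its
    # cached results) is removed automatically.
    cache = weakref.WeakKeyDictionary()

    @functools.wraps(func)
    def wrapper(self, *args):
        per_instance = cache.setdefault(self, {})
        if args not in per_instance:
            per_instance[args] = func(self, *args)
        return per_instance[args]

    return wrapper
```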

joudan