
I have a class with a method whose results I want to cache properly, i.e. so that the cached results are released once the object is no longer in use. Example:

import functools
import numpy as np

class Foo:
    def __init__(self, dev):
        self.dev = dev

    @functools.cache
    def bar(self, n):
        return np.random.normal(scale=self.dev, size=n)

if __name__ == '__main__':
    for i in range(100000):
        foo = Foo(i)
        _ = foo.bar(1000000)

This creates a memory leak that is hard to discover: the cache holds a reference to every Foo instance (self is part of the cache key), so none of them is ever garbage collected. How do I do this properly? For properties there is functools.cached_property, but that does not work for methods that take arguments.
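
For reference, this is a minimal sketch of the cached_property pattern mentioned above (baz is just an illustrative name): the cached value is stored on the instance itself, so it is released together with the instance, but the decorated method cannot take any arguments.

import functools
import numpy as np

class Foo:
    def __init__(self, dev):
        self.dev = dev

    @functools.cached_property
    def baz(self):
        # The result is stored in self.__dict__, so it is garbage
        # collected together with the instance; no arguments are possible.
        return np.random.normal(scale=self.dev, size=1000000)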

olebole
1 Answer


With just two changes, you can considerably improve your algorithm’s runtime:

  1. Import the @lru_cache decorator from the functools module.
  2. Use @lru_cache to decorate model().

Here’s what the top of the script will look like with the two updates:

"""
Module: base_dao.py
Author: Imam Hossain Roni
Created: April 01, 2020
Description: A base DAO used to separate the data persistence logic
into its own layer.
"""

import functools
from abc import ABC, abstractmethod


class Dao(ABC):
    # Subclasses must override these class attributes.
    MODEL_CLASS = None
    SAVE_BATCH_SIZE = 1000
    VALIDATOR_CLASS = None

    @property
    @abstractmethod
    def model_cls(self):
        """Return the model class handled by this DAO."""

    @property
    @functools.lru_cache(maxsize=None)
    def model(self):
        # lru_cache memoizes the call keyed on self, so the model
        # is instantiated only once per DAO object.
        return self.model_cls()
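
A minimal usage sketch continuing from the Dao class above, assuming a hypothetical UserDao subclass and a stand-in User model class (neither appears in the original code): repeated access to dao.model returns the same cached object, because lru_cache memoizes the call per DAO instance.

class User:
    pass


class UserDao(Dao):
    MODEL_CLASS = User

    @property
    def model_cls(self):
        # Concrete implementation of the abstract property.
        return self.MODEL_CLASS


if __name__ == '__main__':
    dao = UserDao()
    first = dao.model
    second = dao.model
    assert first is second  # model_cls() was called only once for this DAO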