
For a Python application that includes the following modules:

commons.py

    from functools import lru_cache

    @lru_cache(maxsize=2)
    def memoized_f(x):
        ...

pipeline_a.py

    from commons import memoized_f

    x = memoized_f(10)
    y = memoized_f(11)

pipeline_b.py

    from commons import memoized_f

    x = memoized_f(20)
    y = memoized_f(21)
  1. Does Python store one memoized_f cache per pipeline_* module, so that in the example above there will be two caches in total for memoized_f? Or
  2. because the caching is defined on memoized_f itself, is there only one cache for memoized_f in the whole application containing all the modules above?
  • You could test this yourself by putting a `time.sleep` in `memoized_f` and then checking whether the function (= sleep) or the cache (= instant) gets queried. – xjcl Jun 01 '20 at 19:47
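
A minimal sketch of that test. The `time.sleep(2)` body and the `main.py` driver are assumptions added for illustration, not part of the original modules:

    # commons.py -- hypothetical slow body added for the test
    import time
    from functools import lru_cache

    @lru_cache(maxsize=2)
    def memoized_f(x):
        time.sleep(2)  # slow on a cache miss, instant on a cache hit
        return x * x

    # main.py -- hypothetical driver importing both pipelines
    import time

    import pipeline_a          # runs memoized_f(10) and memoized_f(11): ~4 s
    import pipeline_b          # runs memoized_f(20) and memoized_f(21): ~4 s

    from commons import memoized_f

    start = time.time()
    memoized_f(21)             # already cached by pipeline_b, so returns instantly
    print(f"cached call took {time.time() - start:.3f} s")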

1 Answer


@functools.lru_cache doesn't do any magic. It is a function decorator, meaning it takes the decorated function (here: memoized_f) as input and returns a new, wrapped function. Essentially it does memoized_f = lru_cache(maxsize=2)(memoized_f) (see here)
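
A sketch of what the decorator expands to (the placeholder body is an assumption; `.cache_info()` is part of the real `lru_cache` API):

    from functools import lru_cache

    def memoized_f(x):
        return x * x  # placeholder body for illustration

    # lru_cache(maxsize=2) builds a decorator, which then wraps memoized_f
    memoized_f = lru_cache(maxsize=2)(memoized_f)

    memoized_f(10)                  # miss: computed and stored
    memoized_f(10)                  # hit: served from the cache
    print(memoized_f.cache_info())  # CacheInfo(hits=1, misses=1, maxsize=2, currsize=1)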

So your question boils down to whether a (function in a) module imported by two other modules shares state, and the answer is yes: there is a single common cache. This is because each module is only imported (and executed) once; later imports reuse the already-loaded module.

See e.g. the official documentation
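
One way to see that both pipelines get the very same function object, and therefore the same cache, assuming the three modules above are on the import path:

    import pipeline_a   # importing runs memoized_f(10) and memoized_f(11)
    import pipeline_b   # importing runs memoized_f(20) and memoized_f(21)
    import commons

    # all three names refer to the same function object
    assert pipeline_a.memoized_f is pipeline_b.memoized_f is commons.memoized_f

    # a single shared cache: 4 misses from both pipelines, maxsize=2 keeps the last two
    print(commons.memoized_f.cache_info())
    # e.g. CacheInfo(hits=0, misses=4, maxsize=2, currsize=2)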

xjcl