
I wanted to calculate a recursive function using lru_cache. Here is a simplified version of it:

from functools import lru_cache

@lru_cache(maxsize=None)
def f(n: int) -> int:
    if n == 0:
        return 1
    return n + f(n - 1)

### MAIN ###
print(f(1000))

It works well when I run f(100), but with f(1000) I get:

RecursionError: maximum recursion depth exceeded in comparison

One solution is to calculate a table of values for f myself. Is there a solution that does not require me to manually create a table of values?

Erel Segal-Halevi
  • this has nothing to do with `lru_cache`. Python simply limits recursion to a depth of 1000 by default. –  Jul 24 '18 at 18:36
  • Possible duplicate of [What is the maximum recursion depth in Python, and how to increase it?](https://stackoverflow.com/questions/3323001/what-is-the-maximum-recursion-depth-in-python-and-how-to-increase-it) –  Jul 24 '18 at 18:37
  • @hop this is not a duplicate. See the answer below. – Erel Segal-Halevi Jul 24 '18 at 19:00
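As the linked question covers, the standard workaround is to raise the interpreter's recursion limit. A sketch (the limit of 5000 here is an arbitrary choice; very large limits risk overflowing the C stack):

```python
import sys
from functools import lru_cache

@lru_cache(maxsize=None)
def f(n: int) -> int:
    if n == 0:
        return 1
    return n + f(n - 1)

# Allow ~1000 recursive frames plus interpreter overhead.
sys.setrecursionlimit(5000)
print(f(1000))  # 500501
```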

1 Answer


Note that you can use your function as-is, provided each fresh call doesn't have to recurse more than several hundred levels before it hits a cached value or the base case; e.g.,

>>> f(400)
80201
>>> f(800)  # will stop recursing at 400
320401
>>> f(1000) # will stop recursing at 800
500501

I've resorted to that at times ;-) More generally, you could write a wrapper function that repeatedly tries f(n), catches RecursionError, and backs off to calling it with ever-smaller values of n. For example,

def superf(n, step=400):
    """Evaluate f(n) without exceeding the recursion limit."""
    pending = []
    while True:
        pending.append(n)
        try:
            f(n)
            break
        except RecursionError:
            # Too deep - back off and try a smaller argument first.
            n = max(n - step, 0)
    # Replay the failed arguments smallest-first; each call now recurses
    # at most `step` levels before hitting a cached value.
    while pending:
        x = f(pending.pop())
    return x

Then

>>> superf(100000)
5000050001
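If the target `n` is known up front, a plain warm-up loop is a simpler variant of the same idea (a sketch; `warmup_f` is a made-up name, and the step of 400 assumes the default recursion limit of 1000):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def f(n: int) -> int:
    if n == 0:
        return 1
    return n + f(n - 1)

def warmup_f(n: int, step: int = 400) -> int:
    # Prime the cache in increments below the recursion limit, so no
    # single call recurses more than `step` levels before reaching a
    # cached value.
    for k in range(step, n + 1, step):
        f(k)
    return f(n)

print(warmup_f(100000))  # 5000050001
```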
Tim Peters
  • @hop, I doubt anyone is joking ;-) The OP appears to be a doctoral candidate trying lots of ideas. Recursive functions are often very natural in research, and `@lru_cache` can speed them enormously. When they work out well for small cases, you want to push to larger & larger cases. In my experience, most ideas eventually fizzle out. The goal is to write the least code necessary to evaluate an idea "fast enough", and every by-hand code optimization of an obvious recursive function risks introducing errors. – Tim Peters Jul 25 '18 at 17:14
  • @TimPeters: I'm not going up against _you_, but nothing was learned here today. –  Jul 26 '18 at 12:41