4

Consider the following code:

def printlen(x): print(len(x))

def argslist(x): return list(x.args)

def add(value_error, arg): raise Exception(*value_error, *arg)

def fibonacci(a, b, x):
    try:
        assert b<x
        printlen(b)
        try:
            add(a,b)
        except Exception as e:
            fibonacci(argslist(e), a, x)
    except AssertionError as e:
        print(e)

threshold = 1000
fibonacci([[]],[[]],[[]]*threshold)

The code prints the Fibonacci numbers from 1 up to threshold using def, try, except and raise and two helper functions. If you remove the "assert b < x" line or set threshold to a very big number, python.exe will take over the RAM and freeze everything. Isn't there a mechanism to limit nested raises, trys and excepts, or to just prevent so much memory from being allocated?

decorator-factory
  • 2,733
  • 13
  • 25
  • Definitely has nested exception handling; see answer here: https://stackoverflow.com/questions/19883531/nested-try-except-in-python/19887222 – gerowam Aug 30 '18 at 14:36
  • @gerowam I was asking whether there was a mechanism that would prevent too much nestedness. – decorator-factory Aug 30 '18 at 14:41
  • 1
    Ah -- something like exception unrolling, but with stack-based recursion. I'd guess no (as answer below implies) since Python is interpreted and the interpreter won't know how deep the recursion goes. Instead of running out of memory, you might be hitting the recursion depth limit. Maybe? – gerowam Aug 30 '18 at 14:50
  • @gerowam recursion depth is a runtime thing, so it has nothing to do with "interpreted" vs "compiled". Also, being compiled or interpreted is not a property of a language but of a language implementation (there are C interpreters, FWIW). And finally, CPython (the reference implementation) is compiled to bytecode, so stating it is "interpreted" is not totally accurate... – bruno desthuilliers Jan 07 '19 at 20:39

2 Answers

4

will take over the RAM

Technically, that's not possible.

RAM refers to physical memory and python.exe cannot access physical RAM. An executable in Windows only gets access to virtual memory (MSDN). How much of that is actually in RAM is decided by Windows.

For 32-bit Python, that should not be much of a problem, since it only has access to a maximum of 4 GB. But 64-bit Python, of course, will start using the page file and writing to disk, which makes things slow. This will even affect other programs. For details, see memory limits (MSDN).

and freeze everything

Also, hardly possible, since Python cannot have exclusive access to the CPU. A mechanism called preemption will ensure that Windows can assign the CPU to something else.

The code you posted neither uses multithreading nor multiprocessing, so it will use only 1 CPU. If you have a single core machine, that may be at 100%, of course. On my machine, it's just 25% and everything else continues running smoothly:

25% CPU for Python

Isn't there a mechanism to limit nested raises, trys and excepts

I don't think so.
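The closest thing is the interpreter's general recursion limit, which caps the call stack as a whole rather than nested handlers specifically (a sketch; note the limit counts all stack frames, not just the nested try/except ones):

```python
import sys

sys.setrecursionlimit(100)  # default is usually 1000

def recurse(n):
    # Each level nests one more try/except frame, like the fibonacci trick.
    try:
        return recurse(n + 1)
    except RecursionError:
        return n  # depth at which the interpreter refused to go deeper

print(recurse(0))  # somewhere a little below 100
```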

or just prevent so much memory from being allocated?

Yes, there's a mechanism called a "Job" (MSDN) in Windows which can do that. You would need to run Python.exe in a job which has a memory quota. There's a Superuser question which suggests tools for that.
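On Unix-like systems, the counterpart of a Job's memory quota is resource.setrlimit (a sketch assuming Linux, where RLIMIT_AS is actually enforced; the cap applies to the whole process):

```python
import resource

# Cap the process's virtual address space at ~1 GB; any allocation that
# would exceed the cap fails with MemoryError instead of paging to disk.
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (1024 * 1024 * 1024, hard))

try:
    data = bytearray(2 * 1024 * 1024 * 1024)  # 2 GB request, over the cap
except MemoryError:
    print("allocation refused")
```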

As an alternative, you could of course implement a memory limit condition for the Fibonacci sequence to stop, like this:

import os
import psutil  # third-party package: pip install psutil

process = psutil.Process(os.getpid())
ram = psutil.virtual_memory().total  # total physical RAM
ram_limit = 70 / 100 * ram           # stop at 70% of it

[...]

if process.memory_info().private > ram_limit:  # 'private' is a Windows-specific field
    assert False

I used that if plus assert False for debugging reasons, so that I could put a breakpoint on the assertion to take the following screenshot.

Python at memory limit

As you can see, the private bytes (memory used by the process) are ~23.5 GB, and the working set (the part that fits into the computer's physical RAM) is also ~23.5 GB. No performance issues so far.

Thomas Weller
  • 55,411
  • 20
  • 125
  • 222
  • there is a way to limit the number of recursions, sys.setrecursionlimit https://docs.python.org/2/library/sys.html#sys.setrecursionlimit – Christian Sauer Aug 30 '18 at 15:05
  • @ChristianSauer: yes, but hardly applicable, since it will neither restrict RAM nor CPU. It's also not reliable for getting a predictable result, since it counts not just the recursion of the fibonacci function but all stack frames. Some methods are recursive, but not deep. – Thomas Weller Aug 30 '18 at 15:09
  • Thanks for the answer! The condition is already there: assert b<x – decorator-factory Aug 30 '18 at 15:26
2

No, there is no such mechanism. Python allocates memory as long as it can; when an allocation fails, it raises MemoryError.
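A minimal illustration of that behavior (allocate is my own name; on a typical 64-bit machine, an absurdly large request fails immediately rather than thrashing):

```python
import sys

def allocate(n):
    # Ask Python for n bytes; report failure instead of crashing.
    try:
        return bytearray(n)
    except MemoryError:
        return None

print(allocate(10) is not None)            # a small request succeeds
print(allocate(sys.maxsize // 2) is None)  # ~4.6 EB cannot be allocated
```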

Tigran Saluev
  • 3,351
  • 2
  • 26
  • 40