
I asked this question as a continuation of Limit function execution in Python.

I found a way to do it without threads, etc.: just a simple check from time to time.

Here is my decorator:

import functools
import threading
import time

class TimeoutException(Exception):
    pass

def time_limit(seconds):
    def decorator(func):
        # threading.local, so each worker thread gets its own end_time
        func.info = threading.local()

        def check_timeout():
            if hasattr(func.info, 'end_time'):
                if time.time() > func.info.end_time:
                    raise TimeoutException

        func.check_timeout = check_timeout

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # set the deadline only on the first call in this thread
            if not hasattr(func.info, 'end_time'):
                func.info.end_time = time.time() + seconds
            return func(*args, **kwargs)

        return wrapper

    return decorator

And usage:

@time_limit(60)
def algo():
    do_something()
    algo.check_timeout()
    do_something_else()

It works fine on localhost, but it fails on a server running Apache with mod_wsgi and Django.

  1. First problem: notice the hasattr check? I had to add it, because from time to time I got the error '_thread.local' object has no attribute 'end_time'.
  2. Why do I need threading.local at all? As @Graham Dumpleton pointed out, we can't have a single global end time, because a subsequent request would come in and overwrite it: if the first request hadn't finished, its end_time would get reset to whatever was set for the later request. The problem is that this approach doesn't help either. Suppose I have the following sequence of runs:

First run - before timeout occurs - runs perfectly

Second run - before timeout occurs - runs perfectly

Third run - timeout occurs - raises TimeoutException

All subsequent calls raise TimeoutException, whether they actually run past the time limit or not.

It seems that all subsequent calls see the third run's copy of end_time, and since that deadline has already passed, they also raise TimeoutException.
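
To illustrate what I think is happening, here is a minimal sketch outside of Django (fake_request is a hypothetical stand-in for a real request handler), showing that a thread-local attribute set once survives across later calls on the same long-lived thread:

import threading
import time

info = threading.local()

def fake_request(seconds):
    # The guard below only fires on the very first call in this thread,
    # so later calls keep seeing the old deadline.
    if not hasattr(info, 'end_time'):
        info.end_time = time.time() + seconds
    return time.time() > info.end_time  # True means "timed out"

print(fake_request(1))   # False: deadline was just set
time.sleep(2)
print(fake_request(60))  # True: still sees the old, expired deadline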

How can I localize end_time for each function call? Thank you.

EDIT: Thanks to @miki725 and @Antti Haapala, I simplified my function and made it a simple class:

class TimeLimit(object):
    def __init__(self, timeout=60):
        self.timeout = timeout
        self.end = None

    def check_timeout(self):
        # the first call starts the timer; later calls enforce the deadline
        if self.end and time.time() > self.end:
            raise TimeoutException
        else:
            self.start()

    def start(self):
        if not self.end:
            self.end = time.time() + self.timeout

However, it is very inconvenient for me to pass the timer into the function, because algo is actually a very complex recursive function. So I did the following:

timer = TimeLimit()  # fails; I think because it is global
def algo():
    do_stuff()
    timer.check_timeout()
    do_another_stuff()
    sub_algo()  # it calls timer.check_timeout() inside too
    algo()
    ...

Is there any way to make the timer thread-safe? Would a pseudo-private attribute like _timer be of any help?
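
One idea I had (just a sketch, untested under mod_wsgi; get_timer and _algo are hypothetical names): keep the timer in a threading.local and reset it in the top-level entry point, so each request gets a fresh deadline while recursive calls can still reach the shared timer:

import threading

_local = threading.local()

def get_timer():
    # hypothetical helper so nested and recursive calls can reach the timer
    return _local.timer

def algo():
    _local.timer = TimeLimit(60)  # fresh timer on every top-level call
    return _algo()

def _algo():
    do_stuff()
    get_timer().check_timeout()
    do_another_stuff()
    sub_algo()  # may call get_timer().check_timeout() internally too
    _algo()
    ...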

– Paul R

2 Answers


The problem is that you are adding end_time to the function object itself. Since the worker threads are long-lived and the hasattr guard sets the deadline at most once per thread, you will effectively set end_time only n times, where n is the number of threads you are running (which in your case seems to be 2).

To solve this, you could always set end_time on every call (see the sketch below); however, that does not seem elegant to me, since it makes assumptions about what will be executed.
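
A minimal sketch of that first option, assuming the wrapper from the question (the caveat in the comment is why I would avoid it):

@functools.wraps(func)
def wrapper(*args, **kwargs):
    # Reset the deadline unconditionally on every call, so a stale value
    # can never leak into the next request served by this thread.
    # Caveat: if the decorated function calls itself recursively, every
    # recursive call resets the deadline and the timeout never fires.
    func.info.end_time = time.time() + seconds
    return func(*args, **kwargs)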

The other solution is to use classes. That keeps the state on a class instance, so this issue will not occur:

class ExecuteWithTimeout(object):
    def __init__(self, to_execute, timeout):
        self.to_execute = to_execute
        self.timeout = timeout
        self.end = None

    def check_timeout(self):
        if time.time() > self.end:
            raise TimeoutException

    def __call__(self, *args, **kwargs):
        # each instance carries its own deadline, so nothing is shared
        # between threads or requests
        self.end = time.time() + self.timeout
        result = self.to_execute(*args, **kwargs)
        self.check_timeout()
        return result

def usage():
    stuff = ExecuteWithTimeout(do_something, 10)()
    do_something_else(stuff)

Another approach is to use a context manager:

from contextlib import contextmanager

@contextmanager
def timeout_limit(timeout):
    end = time.time() + timeout
    yield
    # the check runs only when the block finishes, so this reports an
    # overrun rather than interrupting the block
    if time.time() > end:
        raise TimeoutException

def usage():
    with timeout_limit(10):
        do_stuff()
    more_things()

Or better yet, you can combine the two:

class TimeLimit(object):
    def __init__(self, timeout=60):
        self.timeout = timeout
        self.end = None

    def __enter__(self):
        self.end = time.time() + self.timeout
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.check_timeout()

    def check_timeout(self):
        if self.end and time.time() > self.end:
            raise TimeoutException

def algo():
    with TimeLimit(2) as timer:
        time.sleep(1)
        timer.check_timeout()
        time.sleep(1)
        timer.check_timeout()

Update for your update:

timer = TimeLimit() # fails. I think because it is global
def algo():
    ...

Using the class as above does not help you, because the timer instance is then a module-level global shared by all threads, which puts you back at the initial problem. The underlying issue is keeping shared state around, so it does not matter whether you store it in a class instance or as a function object attribute. Your functions should explicitly pass state to the inner functions that need it; you should not rely on global state to do so:

def algo():
    with TimeLimit(2) as timer:
        do_stuff(timer)
        timer.check_timeout()
        do_more_stuff(timer)
        timer.check_timeout()
– miki725

The problem is the hasattr guarding the setting of end_time:

if not hasattr(func.info, 'end_time'):
    func.info.end_time = time.time() + seconds

end_time is set only once per thread. The threads are long-lived and serve many requests, so the absolute time limit is set once for each thread but never cleared.
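
If you wanted to keep the decorator, a minimal sketch of a fix would be to make the outermost call own the deadline and clear it when it returns, so the next request served by the same thread starts fresh:

@functools.wraps(func)
def wrapper(*args, **kwargs):
    # only the outermost call sets the deadline; recursive calls reuse it
    owner = not hasattr(func.info, 'end_time')
    if owner:
        func.info.end_time = time.time() + seconds
    try:
        return func(*args, **kwargs)
    finally:
        if owner:
            # clear the thread-local deadline so it cannot leak into
            # the next request handled by this long-lived thread
            del func.info.end_time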


As for the usefulness of this decorator: I think it is not clever; I'd say it is outright ugly. Why abuse decorators and battle with thread-locals for something a single closure would do:

def timelimit_checker(time_limit):
    end = time.time() + time_limit
    def checker():
        if time.time() > end:
            raise TimeoutException

    return checker

def sub_algo(check_limit):
    ...
    check_limit()
    ...

def algo():
    check_limit = timelimit_checker(60)
    ...
    check_limit()
    ...
    sub_algo(check_limit)
  • So I actually need a separate end_time for each thread/request? When two users access the same page simultaneously, they should have unique end_times, i.e. end_times shouldn't be shared, or what do you mean? Sorry, I'm new to these things. I tried to remove hasattr, but, as I pointed out in 1., the error '_thread.local' object has no attribute 'end_time' was raised. What do you suggest? – Paul R Feb 14 '15 at 18:47
  • RESPONSE to your edit: when algo runs, it invokes many functions, so the check should be shareable between functions, e.g. def algo(): algo.check(); sub_algo() and def sub_algo(): algo.check(). – Paul R Feb 14 '15 at 18:49
  • then share the timelimit checker. – Antti Haapala -- Слава Україні Feb 14 '15 at 18:53
  • Thank you very much, I finally got the idea. I like your answer, but will choose the other one as best, since classes seem cleaner to me. – Paul R Feb 14 '15 at 19:37