
Is there a better way to create a timeout for a function in Python?

Following pebble's documentation, I wrote a function where I pass my original function as an argument:

from pebble import ProcessPool
from concurrent.futures import TimeoutError

def timeout(timeout, function, *args):
    with ProcessPool() as pool:
        # pebble terminates the worker process itself once the timeout expires
        future = pool.schedule(function, args=args, timeout=timeout)
        try:
            return future.result()
        except TimeoutError:
            raise  # re-raise, keeping the original traceback
        finally:
            future.cancel()

It works fine; the only problem is that the performance is terrible. Just to compare: if I run my original function without this "wrapping", it takes about 1.5 seconds, while running it through this timeout function takes over 8 seconds.
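
I suspect most of the extra time is spent creating and tearing down a whole pool of worker processes on every single call (on Windows they are spawned, not forked). A rough, untested sketch of what reusing one pool across several calls could look like (`run_with_timeout` and the example calls are just placeholders):

from pebble import ProcessPool
from concurrent.futures import TimeoutError

def run_with_timeout(pool, seconds, function, *args):
    # Reuse an already-running pool instead of building a new one per call.
    future = pool.schedule(function, args=args, timeout=seconds)
    try:
        return future.result()
    except TimeoutError:
        raise  # pebble has already terminated the timed-out worker
    finally:
        future.cancel()

if __name__ == "__main__":
    # Create the pool once; its worker processes are spawned a single time
    # and reused for every call.
    with ProcessPool(max_workers=1) as pool:
        print(run_with_timeout(pool, 5, sum, [1, 2, 3]))
        print(run_with_timeout(pool, 5, max, [4, 5, 6]))

I haven't measured whether this actually removes the overhead, though.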

Any proposed solution should take into account that when the timeout is reached, the running process must be ended or cancelled, not left running indefinitely.

I'm running on Windows, so `signal` is not an option.

Any suggestions?

Jlahamz
  • You can just use `time`? `time.sleep(10)` – fox tech. Aug 02 '22 at 20:17
  • 1: use a thread instead of a process (much lighter weight) 2: on unix, use [`signal.alarm`](https://docs.python.org/3/library/signal.html#signal.alarm) (there's possibly an alternative for windows, but I'd have to go find it). I'm not surprised creating an entire processing pool for calling a single function with a timeout has a huge overhead. – Aaron Aug 02 '22 at 20:19
  • `time.sleep` won't act as a timeout; it only pauses execution for some time before continuing later – Jlahamz Aug 02 '22 at 22:16
  • 1
    If I use a thread, how would I kill it after some time? – Jlahamz Aug 02 '22 at 22:17
  • Killing threads is a bad idea, as explained here: https://stackoverflow.com/questions/323972/is-there-any-way-to-kill-a-thread. You may want to consider asyncio, where tasks can be cancelled. It has other limitations, however, and may not be appropriate for your application. – Paul Cornelius Aug 05 '22 at 05:34
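
Expanding on the asyncio suggestion in the last comment, a minimal sketch of cancelling a task on timeout with `asyncio.wait_for` (the coroutine here is just a stand-in; this only helps if the real work can be written as a coroutine that awaits, so it won't interrupt blocking CPU-bound code):

import asyncio

async def slow_work():
    # stand-in for the real work, which has to be async-aware
    await asyncio.sleep(5)
    return "done"

async def main():
    try:
        # wait_for cancels the task once the timeout expires
        print(await asyncio.wait_for(slow_work(), timeout=2))
    except asyncio.TimeoutError:
        print("timed out, task was cancelled")

asyncio.run(main())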

0 Answers