I have built a tool using Django to automate script execution. The tool works fine, but sometimes the scripts take too long to run. I want to limit the time for which my tool can execute each script. I have found two approaches and implemented both, but I am not sure which is the right one to use.
1.) Using the signal module
2.) Using multiprocessing
Here is sample code for both approaches:
1.) Using the signal module
import signal
from contextlib import contextmanager

class TimeoutException(Exception):
    pass

@contextmanager
def time_limit(seconds):
    def signal_handler(signum, frame):
        raise TimeoutException("Timed out!")
    signal.signal(signal.SIGALRM, signal_handler)
    signal.alarm(seconds)
    try:
        yield
    finally:
        signal.alarm(0)

try:
    with time_limit(10):
        long_function_call()
except TimeoutException as e:
    print("Timed out!")
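For a self-contained demonstration of the approach above (a minimal sketch, assuming a Unix system where SIGALRM is available), time.sleep can stand in for the long-running call:

```python
import signal
import time
from contextlib import contextmanager

class TimeoutException(Exception):
    pass

@contextmanager
def time_limit(seconds):
    def signal_handler(signum, frame):
        raise TimeoutException("Timed out!")
    signal.signal(signal.SIGALRM, signal_handler)
    signal.alarm(seconds)
    try:
        yield
    finally:
        signal.alarm(0)  # cancel the pending alarm

def run_demo():
    try:
        with time_limit(1):
            time.sleep(3)  # stands in for long_function_call()
        return "finished"
    except TimeoutException:
        return "timed out"

print(run_demo())  # timed out
```

Here the alarm fires after one second, interrupting sleep() and raising TimeoutException inside the with block.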
2.) Using multiprocessing
from multiprocessing import Process
from time import sleep

def f(time):
    sleep(time)

def run_with_limited_time(func, args, kwargs, time):
    p = Process(target=func, args=args, kwargs=kwargs)
    p.start()
    p.join(time)
    if p.is_alive():
        p.terminate()
        return False
    return True

if __name__ == '__main__':
    print(run_with_limited_time(f, (1.5,), {}, 2.5))  # True
    print(run_with_limited_time(f, (3.5,), {}, 2.5))  # False
The problem I am facing with the signal module is that signals only work in the main thread. I want to know which is the better approach and why. Also, is there any approach I can use to work around this behaviour of the signal module?
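To illustrate the limitation I am hitting (a minimal sketch, not part of my tool): in CPython, calling signal.signal() from any thread other than the main thread raises ValueError.

```python
import signal
import threading

def try_set_handler(results):
    # Attempt to register a SIGALRM handler from a worker thread.
    # CPython raises ValueError here, since signal handlers may only
    # be set in the main thread of the main interpreter.
    try:
        signal.signal(signal.SIGALRM, lambda signum, frame: None)
        results.append("ok")
    except ValueError:
        results.append("main-thread-only")

results = []
t = threading.Thread(target=try_set_handler, args=(results,))
t.start()
t.join()
print(results[0])  # main-thread-only
```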