
Possible Duplicate:
Timeout on a Python function call

I want the function to return immediately (with a timeout) if it takes more than 90 seconds to complete. Is there any way to achieve that?

def abc(string):
    import re
    if re.match('some_pattern', string):
        return True
    else:
        return False

abc('some string to match')

Edited

Please download this test file. I have created a thread class, and I raise an exception within the thread if a timeout occurs. But the thread is still alive, because it prints "i am still alive :)" even after the exception. Why doesn't the exception force the thread to stop?

Aamir Rind
  • This kind of question usually gets answers involving threads, `signal.alarm()`, `signal.setitimer()`, etc. Be careful with those answers -- they probably rely on Python exceptions being thrown, but I doubt a Python exception can interrupt `re.match()`. (Not sure about the right answer. Maybe use a subprocess, and kill it after 90 seconds.) – Sven Marnach Nov 28 '11 at 16:44
  • @aix: While the linked question is very similar, none of the answers there is really applicable here. Only the checked answer would work at all, but it would leave the stalled function running in the background and consuming CPU time. That's why I'm not voting to close. – Sven Marnach Nov 28 '11 at 18:09
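The subprocess idea from the comment above can be sketched in modern Python (3.3+, where `Popen.wait` accepts a `timeout`). This is only an illustration of the suggestion, not code from the question: the regex runs in a separate Python process, and the parent kills it if the deadline passes. Killing a whole process works even when the child is stuck inside a single long `re.match()` call, which no Python-level exception could interrupt.

```python
import subprocess
import sys

# Child script: exit code 0 if the pattern matches, 1 otherwise.
child_code = (
    "import re, sys; "
    "sys.exit(0 if re.match('some_pattern', sys.argv[1]) else 1)"
)

proc = subprocess.Popen([sys.executable, '-c', child_code,
                         'some_pattern to match'])
try:
    matched = proc.wait(timeout=90) == 0  # the question's 90-second limit
    print(matched)
except subprocess.TimeoutExpired:
    proc.kill()   # SIGKILL stops the child even mid-re.match()
    proc.wait()   # reap the child process
    print('Timed out')
```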

2 Answers


I've edited my post to use jcollado's idea, which is simpler.

The multiprocessing.Process.join method has a timeout argument which you can use like this:

import multiprocessing as mp
import time
import logging  
import re

logger = logging.getLogger(__name__)

def abc(string, result, wait=0):
    time.sleep(wait)  # simulate a slow call
    result.put(bool(re.match('some_pattern', string)))

if __name__ == '__main__':
    logging.basicConfig(level=logging.DEBUG,
                        format='%(asctime)s:  %(message)s',
                        datefmt='%H:%M:%S')
    result = mp.Queue()
    proc = mp.Process(target=abc, args=('some_pattern to match', result))
    proc.start()
    proc.join(timeout=5)
    if proc.is_alive():
        proc.terminate()
    else:
        logger.info(result.get())

    proc = mp.Process(target=abc, args=('some string to match', result, 20))
    proc.start()
    proc.join(timeout=5)
    if proc.is_alive():
        logger.info('Timed out')
        proc.terminate()
    else:
        logger.info(result.get())

yields

12:07:59:  True
12:08:04:  Timed out

Note that you get the "Timed out" message after 5 seconds, even though `abc('some string to match', result, 20)` would have taken around 20 seconds to complete.
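The same pattern can be wrapped into a small reusable helper. This is only a sketch of the answer's technique; the name `run_with_timeout` and its return convention are my own, not from the original post:

```python
import multiprocessing as mp
import re

def abc(string, result):
    result.put(bool(re.match('some_pattern', string)))

def run_with_timeout(target, args, timeout):
    """Run target(*args, result_queue) in a subprocess.

    Returns (finished, value); value is None if the deadline passed.
    """
    result = mp.Queue()
    proc = mp.Process(target=target, args=args + (result,))
    proc.start()
    proc.join(timeout)
    if proc.is_alive():
        proc.terminate()   # hard-stop the stalled worker
        proc.join()        # reap it
        return False, None
    return True, result.get()

if __name__ == '__main__':
    ok, value = run_with_timeout(abc, ('some_pattern to match',), timeout=90)
    print(ok, value)   # the 90-second limit from the question
```

Because the worker is terminated on timeout, it does not keep consuming CPU time in the background, which is the shortcoming Sven Marnach pointed out in the thread-based answers.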

unutbu
  • Will the operation in the subprocess be interrupted, or will it continue to consume CPU time? – Sven Marnach Nov 28 '11 at 17:59
  • @Sven: Your comment was probably before my edit -- `pool.terminate` will interrupt the subprocess. – unutbu Nov 28 '11 at 18:30
  • @SvenMarnach, Docs for `Pool.terminate()` say "Stops the worker processes immediately without completing outstanding work. When the pool object is garbage collected terminate() will be called immediately." and `Process.terminate()` says "Terminate the process. On Unix this is done using the SIGTERM signal; on Windows TerminateProcess() is used. Note that exit handlers and finally clauses, etc., will not be executed." so I think you can assume it does as needed here. – Duncan Nov 28 '11 at 18:39
  • @Duncan: Thanks. As unutbu said, my comment was before the edit. – Sven Marnach Nov 28 '11 at 18:59
  • @unutbu I tried your approach; it works fine on a simple function, but when I used it in my main program I got the following exception: `Traceback (most recent call last): File "C:\Python27\lib\threading.py", line 552, in __bootstrap_inner self.run() File "C:\Python27\lib\threading.py", line 505, in run self.__target(*self.__args, **self.__kwargs) File "C:\Python27\lib\multiprocessing\pool.py", line 313, in _handle_tasks put(task) PicklingError: Can't pickle : attribute lookup thread.lock failed`. – Aamir Rind Nov 28 '11 at 19:32
  • @AamirAdnan: Arguments passed to `pool.apply_async` must be picklable. This is because mp.Pool uses `mp.queue.SimpleQueue`s to pass tasks and results to and from the worker process. Judging from the error message, it looks like some argument is not picklable. You can fix this by sending only [pickable arguments](http://docs.python.org/library/pickle.html#what-can-be-pickled-and-unpickled). For more help, we'd need to see your code. – unutbu Nov 28 '11 at 19:48
  • @unutbu you are a genius, thanks for all the help. If you could comment on my edited post, that would be great for improving my understanding of threading. – Aamir Rind Nov 28 '11 at 19:52
  • Thread execution must be linear. By this I mean, once you call t.start(), the ExcThread's `run` method is called and must be allowed to run linearly. You can not call the ExcThread's methods from the main thread to somehow inject new commands for it to run. Thomas Wouters, a python developer, says in [this post](http://stackoverflow.com/a/3810921/190597), "Threads aren't going to work at all, because there's no way to interrupt the separate thread doing the infinite loop". You don't have an infinite loop, but the main idea is the same: main threads can never interrupt other threads. – unutbu Nov 28 '11 at 20:17

One way to handle this is to put this task into a thread, and use a watchdog to kill it after 90 seconds have passed.

Here's a recipe at ActiveState.

Edit: Obviously, the recipe by itself is not the complete solution. You would either have a watchdog thread that checked every x seconds whether the worker thread was done, or you would move to an event framework like Michael Foord's simple event framework.
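The watchdog-plus-polling approach this answer describes can be sketched as follows. This is an illustration, not the linked recipe itself: the worker checks a stop flag between small units of work, and the main thread acts as the watchdog via `join(timeout)`. Note the limitation raised in the comment below: this only works if the worker can poll the flag, so a single long `re.match()` call cannot be interrupted this way.

```python
import threading
import time

class StoppableWorker(threading.Thread):
    """Worker that polls a stop flag between units of work."""
    def __init__(self):
        super().__init__()
        self._stop_event = threading.Event()

    def run(self):
        # Each loop iteration is one small, interruptible unit of work.
        while not self._stop_event.is_set():
            time.sleep(0.1)  # stand-in for a short chunk of real work

    def stop(self):
        self._stop_event.set()

worker = StoppableWorker()
worker.start()
worker.join(timeout=1.0)   # watchdog deadline (90 seconds in the question)
if worker.is_alive():
    worker.stop()          # ask the worker to finish; works only because it polls
    worker.join()
print('worker stopped:', not worker.is_alive())
```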

Ryan Ginstrom
  • That is a good recipe; please wait until I implement it. Thanks for the help. – Aamir Rind Nov 28 '11 at 17:21
  • This recipe is completely inapplicable to the given problem. It relies on the worker thread polling some flag. How would you make `re.match()` poll a flag? (-1) – Sven Marnach Nov 28 '11 at 18:03