
I want to stop the execution of the exec or eval commands if they take too long to complete. I know how to do it using multiprocessing but I was wondering if there's an easier solution. Any ideas?

yyy
  • I think multiprocessing is the only way... – decadenza Jul 31 '18 at 16:41
  • You could set up a background thread that uses a `signal` to interrupt the main thread—unless the code that you’re `exec`ing could change the signal handlers. But `multiprocessing` is probably a better solution, and not significantly more complicated. – abarnert Jul 31 '18 at 17:16
  • If you do set it up with multiprocessing, consider making it a context manager, so you can do `with quit_after_seconds(30):` – Riley Martine Jul 31 '18 at 17:19
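The signal-based idea from the first comment and the context-manager idea from the second can be combined into one sketch. The name `quit_after_seconds` follows the comment's suggestion; as noted above, this is Unix-only (`SIGALRM`) and can be defeated if the code being `exec`'d installs its own signal handlers:

```python
import signal
from contextlib import contextmanager


@contextmanager
def quit_after_seconds(seconds: int):
    """Raise TimeoutError in the main thread if the body runs too long (Unix only)."""
    def _handler(signum, frame):
        raise TimeoutError(f"timed out after {seconds} seconds")

    old_handler = signal.signal(signal.SIGALRM, _handler)
    signal.alarm(seconds)  # schedule SIGALRM
    try:
        yield
    finally:
        signal.alarm(0)  # cancel the pending alarm
        signal.signal(signal.SIGALRM, old_handler)  # restore the previous handler


if __name__ == "__main__":
    try:
        with quit_after_seconds(1):
            exec("import time\ntime.sleep(5)")
    except TimeoutError as e:
        print(e)
```

Unlike the multiprocessing approach, this interrupts the code in place instead of abandoning a worker process, so any side effects made before the timeout remain visible.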

1 Answer


Even though you said you can do it, here's my solution:

#!/usr/bin/env python3
"""Context manager to limit execution time."""

import multiprocessing
import time
from typing import Callable


def run_until(seconds: int, func: Callable, *args) -> None:
    """Run a function until timeout in seconds reached."""
    with multiprocessing.Pool(processes=2) as pool:
        result = pool.apply_async(func, args)  # args is already a tuple; pass it through
        try:
            result.get(timeout=seconds)
        except multiprocessing.TimeoutError:
            pass


if __name__ == "__main__":
    run_until(1, time.sleep, 20)  # exits after 1 second
Riley Martine
  • Wow, thanks! Your solution is better than the way I did it with multiprocessing. I can almost use it; I just need to figure out how to get the text that `exec()` prints to the console while it runs via apply_async. – yyy Jul 31 '18 at 20:56
  • Update: Found out how to get the console output, the same way as suggested here: https://stackoverflow.com/questions/3906232/python-get-the-print-output-in-an-exec-statement, but with the `with stdoutIO() as s: exec(code)` line inside another function that is passed to apply_async instead of exec. – yyy Aug 01 '18 at 06:05
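The capture step described in that comment can be sketched with the standard library's `contextlib.redirect_stdout` (used here in place of the `stdoutIO` helper from the linked question; the function names `run_code` and `exec_until` are made up for this example):

```python
import io
import multiprocessing
from contextlib import redirect_stdout


def run_code(code: str) -> str:
    """Execute code in the worker process and return whatever it printed."""
    buf = io.StringIO()
    with redirect_stdout(buf):  # capture print() output from the exec'd code
        exec(code)
    return buf.getvalue()


def exec_until(seconds: int, code: str) -> str:
    """Run exec(code) in a worker process; return its output, or '' on timeout."""
    with multiprocessing.Pool(processes=1) as pool:
        result = pool.apply_async(run_code, (code,))
        try:
            return result.get(timeout=seconds)
        except multiprocessing.TimeoutError:
            return ""  # leaving the with-block terminates the stuck worker


if __name__ == "__main__":
    print(exec_until(1, "print('hello from exec')"))
```

Because the worker must be picklable under the spawn start method, `run_code` is defined at module level rather than nested inside `exec_until`.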