
The need:

  1. Time out after X seconds, and kill the process (and all the processes it opened) if the timeout is reached before the process ends gracefully.
  2. Read ongoing output at runtime.
  3. Work with processes that produce output, ones that don't, and ones that produce output and then stop producing it (e.g. get stuck).
  4. Run on Windows.
  5. Run on Python 3.5.2.

Python 3's subprocess module has a timeout built in, and I've also implemented a timeout myself using a timer and using threads, but neither approach works with the output. Is readline() blocking or not? readlines() definitely waits for the process to end before spitting out all the output, which is not what I need (I need it ongoing).

I am close to switching to node.js :-(

Naphtali Gilead

  • The problem with stdout might be in the child process. If the stdout buffer is not flushed, then Python will never receive the contents (and that would be the same whatever language you used). One possible solution (untested) would be to assign the child's stdout to stderr in `subprocess.Popen`. Usually stderr is unbuffered. – cdarke Aug 06 '16 at 06:27
  • Yes, `readline` will block, waiting to receive the next line, as will anything else that reads `sys.stdin`. You can tell Python to make `sys.stdout` unbuffered by specifying the `-u` option on the command line. – PM 2Ring Aug 06 '16 at 06:52
  • No one cares if you switch to node.js... – martineau Aug 06 '16 at 08:49
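
As a minimal illustration of the comments above (assuming the child is itself a Python script; 'child.py' is only a placeholder name), launching it with `-u` makes its stdout unbuffered so lines can be read as they arrive:

import subprocess

# -u makes the child's stdout unbuffered; 'child.py' is a placeholder name.
proc = subprocess.Popen(['python', '-u', 'child.py'], stdout=subprocess.PIPE)

for line in proc.stdout:   # each line is available as soon as the child emits it
    print(line.decode().rstrip())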

3 Answers


I would use asyncio for this kind of task.

Read IO from the process as in this accepted answer: How to stream stdout/stderr from a child process using asyncio, and obtain its exit code after?

(I don't want to fully copy it here)
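
For reference, a minimal protocol in that spirit could look like the sketch below (a rough stand-in, not the linked answer verbatim): it prints pipe data as it arrives and stops the event loop when the child exits.

import asyncio

class SubprocessProtocol(asyncio.SubprocessProtocol):
    def pipe_data_received(self, fd, data):
        # fd 1 is the child's stdout, fd 2 its stderr; show output as it arrives.
        print(fd, data.decode(), end='')

    def process_exited(self):
        # Stop the event loop once the child is gone.
        asyncio.get_event_loop().stop()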

Wrap it in a timeout:

import asyncio

async def killer(trans, timeout):
    # Wait for the timeout, then kill the child via its transport.
    await asyncio.sleep(timeout)
    trans.kill()
    print('killed!!')

# On Windows before Python 3.8, use asyncio.ProactorEventLoop() instead;
# the default selector loop cannot run subprocesses there.
loop = asyncio.get_event_loop()

# SubprocessProtocol is the protocol class from the linked answer (sketched above).
trans, proto = loop.run_until_complete(
    loop.subprocess_exec(
        SubprocessProtocol,
        'py', '-3', '-c', 'import time; time.sleep(6); print("Yay!")',
    )
)

asyncio.ensure_future(killer(trans, 5))  # 5-second timeout for the kill
loop.run_forever()

Have fun ...

Yoav Glazner

Use the two Python scripts below.

  • Master.py uses Popen to start a new process and starts a watcher thread that kills the process after 3.0 seconds.

  • The slave must call the flush method if there is no newline in the data written to stdout (on Windows, '\n' also causes a flush).

Be careful: the time module is not a high-precision timer.

The load time of the process can be longer than 3.0 seconds in extreme cases (e.g. reading the executable from a USB 1.0 flash drive).

Master.py

import subprocess, threading, time

def watcher(proc, delay):
    # Kill the child process after `delay` seconds.
    time.sleep(delay)
    proc.kill()

proc = subprocess.Popen('python Slave.py', stdout=subprocess.PIPE)
threading.Thread(target=watcher, args=(proc, 3.0)).start()

data = bytearray()

while True:
    # Read one byte at a time so output is seen as soon as the slave flushes it.
    chunk = proc.stdout.read(1)
    if not chunk:
        # The pipe closes when the slave exits or is killed by the watcher.
        break
    data.extend(chunk)
    print(data)

Slave.py

import time, sys

while True:
    time.sleep(0.1)
    # No newline in the output, so an explicit flush is needed for the
    # master to see the data as it is produced.
    sys.stdout.write('aaaa')
    sys.stdout.flush()
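
Running `python Master.py` should print the growing data buffer as bytes arrive from the slave; after roughly three seconds the watcher kills the slave, the pipe closes, and the read loop ends.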

Szabolcs Dombi

On Python 3.7+, use subprocess.run() with capture_output=True and timeout=<your_timeout>. If the command doesn't return before <your_timeout> seconds pass, it kills the process and raises a subprocess.TimeoutExpired exception, which has .stdout and .stderr attributes:

import subprocess

try:
    result = subprocess.run(["sleep", "3"], timeout=2, capture_output=True)
except subprocess.TimeoutExpired as e:
    print("process timed out")
    print(e.stdout)
    print(e.stderr)

You might also want to pass text=True (or universal_newlines=True on Python <3.7) so that stdout and stderr are strs instead of bytes.

On older versions of Python, replace capture_output=True with stdout=subprocess.PIPE, stderr=subprocess.PIPE in your call to subprocess.run(); the rest stays the same.
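
For example, on Python 3.5/3.6 that variant might look like this (a sketch; universal_newlines=True is the pre-3.7 spelling of text=True):

import subprocess

try:
    result = subprocess.run(
        ["sleep", "3"],
        timeout=2,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        universal_newlines=True,  # pre-3.7 equivalent of text=True
    )
except subprocess.TimeoutExpired as e:
    print("process timed out")
    print(e.stdout)
    print(e.stderr)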

Edit: this isn't what you wanted because you need to wait for the process to terminate to read the output, but this is what I wanted when I came upon this question.

Boris Verkhovskiy