
I have a program which takes hours to complete and has to be integrated into an existing procedure. I have a script which runs the program through the subprocess module, and this works fine, but there is no way to tell how far the program has advanced. The program does output some real-time information to stdout, so I thought I could do something with reading from the pipes.

However, I can't get it to work correctly. It seems my script blocks on reading from the pipe, so I don't see the output in real time.

I have made a simple script which demonstrates this:

import subprocess

worklist = [
    {
        'name': '1: ',
        'cmd': r'/python27/python.exe printer.py',
        'pid': None
    }, {
        'name': '2: ',
        'cmd': r'/python27/python.exe printer.py',
        'pid': None
    }, {
        'name': '3: ',
        'cmd': r'/python27/python.exe printer.py',
        'pid': None
    }
]

for work in worklist:
    work['pid'] = subprocess.Popen(work['cmd'], stdout=subprocess.PIPE, stderr=subprocess.PIPE, bufsize=0)


while True:
    for work in worklist:
        for line in work['pid'].stdout:
            print work['name'] + str(line)

    if all(item['pid'].poll() is not None for item in worklist):
        break

for work in worklist:
    work['pid'].communicate()

The printer.py script contains:

from time import sleep

for i in range(1, 10):
    print 'this is a process line %d' % i
    sleep(1)

What I would expect to see is that it prints the output of every 'printer.py' run in real time. Instead, each one's output is printed all at once, only after the process has finished.

Is there some way around this using only the subprocess module or other Python built-ins?

I am on a Windows machine, so I can't use pexpect. Also, since this will have to be used on a lot of systems, I don't want to introduce dependencies... or at least as few as possible.
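For what it's worth, one direction I have been experimenting with, using only the standard library, is to give each process its own reader thread so that a blocking readline on one pipe cannot hold up the others. This is just an untested sketch (it reuses the worklist from above), and as far as I understand it still only helps if the child process actually flushes its output:

import subprocess
import threading

def stream_output(name, proc):
    # readline() returns '' only at EOF, i.e. once the child closes its stdout
    for line in iter(proc.stdout.readline, ''):
        # prints from different threads may interleave, but that is good enough for a progress display
        print name + line.rstrip()

threads = []
for work in worklist:
    # merge stderr into stdout so an unread stderr pipe cannot fill up and stall the child
    work['pid'] = subprocess.Popen(work['cmd'], stdout=subprocess.PIPE,
                                   stderr=subprocess.STDOUT, bufsize=0)
    t = threading.Thread(target=stream_output, args=(work['name'], work['pid']))
    t.start()
    threads.append(t)

for t in threads:
    t.join()
for work in worklist:
    work['pid'].wait()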

  • Did you already read: http://stackoverflow.com/questions/375427/non-blocking-read-on-a-subprocess-pipe-in-python ? – Bakuriu Jun 15 '14 at 13:22
  • Have you made sure that the `print` output in `printer.py` is not buffered when the output is to a pipe? If you wrote more data with each print (so that the standard I/O buffer was full each time — that might be 512 or more bytes), then you might see the behaviour you expect. By default, the output is fully buffered when the output device is not an interactive terminal (e.g. when it is a pipe). – Jonathan Leffler Jun 15 '14 at 13:45
  • @JonathanLeffler I added `msvcrt.setmode(sys.stdout.fileno(), os.O_BINARY)` to the printer.py with no difference... –  Jun 15 '14 at 14:12
  • I have got it working, and it works with printer.py if I flush stdout every time I print. However, since I want to use this with an external program where I can't force a flush: is there a way to force a flush from within the calling Python script, maybe every 10 seconds or so? (A sketch of the flushing variant is below this comment thread.) –  Jun 15 '14 at 14:36
  • Binary vs text is orthogonal to buffered vs unbuffered (vs line-buffered). If you're using `msvcrt`, that suggests you're running on Windows. If so, I can't help at all. If you're on a Unix-like system, you may be able to use pseudo-ttys (ptys) to fool the external program into writing to an interactive device and therefore flushing the output using line buffering. I have no idea whether that translates into Windows, but I would assume there isn't a trivial mapping of Unix pty functionality to Windows. – Jonathan Leffler Jun 15 '14 at 15:11
  • Sadly, I am forced to use Windows... –  Jun 15 '14 at 15:16
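For reference, since I mentioned it in the comments above: the variant of printer.py that does give me real-time output simply flushes stdout after every print, roughly like this. For child processes that happen to be Python scripts, the same effect can be had without touching their code by running them with python -u or by passing PYTHONUNBUFFERED=1 in the env argument of Popen; for arbitrary external programs neither trick applies, which is the part I am still stuck on.

import sys
from time import sleep

for i in range(1, 10):
    print 'this is a process line %d' % i
    sys.stdout.flush()  # push the line through the pipe now instead of waiting for the stdio buffer to fill
    sleep(1)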

0 Answers