
I have been trying to troubleshoot piping to and from subprocesses with subprocess.PIPE, with no luck.

I'm trying to pass commands to an always running process and receive the results without having to close/open the process each time.

Here is the main launching code:

launcher.py:

import subprocess
import time

command = ['python', 'listener.py']
process = subprocess.Popen(
    command, bufsize=0,
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT
)

# simulates sending a new command every 10 seconds
for x in range(1,10):
    process.stdin.write(b'print\r\n')
    process.stdin.flush()
    time.sleep(10)

listener.py:

import sys

file = open('log.txt', 'w+')
while True:
    file.write(sys.stdin.read(1))
file.close()

This is simplified to show the relevant pieces. In the end I'll have threads listening on the stdout and stderr, but for now I'm trying to troubleshoot the basics.

What I expect to happen: on each pass through the loop in launcher.py, the file.write() in listener.py writes. What happens instead: nothing is written until the loop finishes and the program terminates, or until I SIGTERM / Ctrl-C the script.

I'm running this on Windows 8 with Python 3.4.

It's almost as if stdin buffers until the process closes and then it passes through. I have bufsize=0 set, and I'm flushing, so that doesn't make sense to me. I thought either one or the other would be sufficient.

The subprocess is running in a different process, so the sleep in launcher should have no impact on the subprocess.

Does anyone have any ideas why this is blocking?

Update 1: The same behaviour is also seen with the following code run from the console (python.exe stdinreader.py)

That is, when you type into the console while the program is running, nothing is written to the file.

stdinreader.py:

import sys
import os

file = open('log.txt', 'w+b')
while True:
    file.write(sys.stdin.buffer.read(1))
file.close()

Adding a file.flush() just before the file.write() solves this, but that doesn't help me with the subprocess, because I don't have control over how the subprocess flushes what it sends back to me over subprocess.PIPE. Maybe if I reinitialize that PIPE with open('wb') it will not buffer. I will try.
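For reference, a minimal sketch of stdinreader.py with the flush added (plus the EOF check from J.F.'s comment, since read(1) returns an empty result at end of input):

import sys

file = open('log.txt', 'w+b')
while True:
    data = sys.stdin.buffer.read(1)
    if not data:
        # read() returns b'' at EOF; stop instead of spinning forever
        break
    file.write(data)
    file.flush()  # push each byte to log.txt immediately
file.close()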

Update 2: I seem to have isolated the problem to the subprocess being called, which is not flushing after its writes to stdout.

Is there anything I can do to force a flush on the stdout PIPE between parent and child without modifying the subprocess? The subprocess is magick.exe (ImageMagick 7) running with args ['-script', '-']. From the point of view of the subprocess, it has a stdout object of <_io.TextIOWrapper name='' mode='w' encoding='cp1252'>. I guess the subprocess just opens the default stdout objects on initialization, and we can't really control whether it buffers or not.

The strange thing is that passing the child the normal sys.stdout object instead of subprocess.PIPE does not require the subprocess to .flush() after write.
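To see that difference from the child's side, here is a tiny hypothetical probe script (not my actual subprocess); run it once directly from the console and once through Popen with stdout=subprocess.PIPE and compare:

probe.py:

import sys

# Report on stderr so the probe itself isn't affected by stdout buffering.
sys.stderr.write('stdout is a console: %r\n' % sys.stdout.isatty())
sys.stderr.write('stdout line-buffered: %r\n' % sys.stdout.line_buffering)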

Chris Edwards
  • Not sure about this. But by default, `Popen` works with shell=False. It accepts a list of arguments. Call it like this: `process = subprocess.Popen( ['python', 'listener.py'], bufsize=0, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT )` See if it changes anything. – swdev Sep 22 '14 at 02:30
  • You're right. I actually had it as a list here already. Behaviour is the same. I've updated my code above per your comment. – Chris Edwards Sep 22 '14 at 07:39
  • Have you tried `file.flush()`? `sys.stdin.read(1)` might return an empty string on EOF, in which case the `while True` loop is infinite. How you write the Popen() call may depend on what actual command you are running. See [Python C program subprocess hangs at “for line in iter”](http://stackoverflow.com/q/20503671/4279) – jfs Sep 22 '14 at 07:43
  • Thanks J.F. We were both thinking the same thing with file.flush(). You were right, this does work with this example. I'll see if I can use that idea with the actual process to get it working and report back. – Chris Edwards Sep 22 '14 at 08:11
  • J.F. Thanks for the link. You're absolutely correct, it seems to be a buffering issue in the subprocess. Your link provides answers that work for POSIX, but I didn't see any solutions for Windows. Do you have any suggestions for Windows? It doesn't seem like it's possible: http://stackoverflow.com/questions/3385427/disable-buffering-on-redirected-stdout-pipe-win32-api-c – Chris Edwards Sep 22 '14 at 09:41
  • @ChrisEdwards: from [my answer](http://stackoverflow.com/a/20509641/4279): `setvbuf()` might work on Windows (perhaps it is called `_setvbuf()` there). I don't know whether `unbuffer`, `script`, `stdbuf` Windows ports can do their job. I'm not sure `pty` works on MinGW or Cygwin. There are Windows ports of `pexpect` module: winpexpect and wexpect -- I don't know whether they would work for your use-case. – jfs Sep 22 '14 at 11:56
  • *"The strange thing is that passing the child the normal sys.stdout object instead of subprocess.PIPE does not require the subprocess to .flush() after write."* -- [@tdelaney's answer](http://stackoverflow.com/a/25965652/4279) is about it: if `sys.stdout` is console then subprocess may use line-buffering, if it is a pipe/file then it uses block-buffering. – jfs Sep 22 '14 at 12:00
  • @j-f-sebastia: thanks. setbuf() worked for me on Windows. I've upvoted you in your other post since it helped me get the answer I needed here. – Chris Edwards Sep 22 '14 at 12:06

3 Answers


Programs run differently depending on whether they are run from a console or through a pipe. If stdout is a console (a Python process can check with sys.stdout.isatty()), stdout data is line-buffered and you see data promptly. If it is a pipe, stdout data is block-buffered and you only see data when quite a bit has piled up or the program flushes the pipe.

When you want to grab program output you have to use a pipe, and then the program runs in block-buffered mode. On Linux you can trick programs by creating a fake console (a pseudo-tty, or pty); the pty module, pexpect and others do that.
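For example, a rough POSIX-only sketch of that trick (it will not help on Windows; 'some_child_program' is a placeholder for whatever you are actually running):

import os
import pty
import subprocess

# Give the child a pty for stdout so it thinks it is writing to a console
# and line-buffers its output.
master, slave = pty.openpty()
proc = subprocess.Popen(
    ['some_child_program'],  # placeholder command
    stdin=subprocess.PIPE,
    stdout=slave,
    stderr=subprocess.STDOUT,
)
os.close(slave)  # the parent keeps only the master end

proc.stdin.write(b'print\r\n')
proc.stdin.flush()
print(os.read(master, 1024))  # output now arrives as it is produced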

On Windows, I don't know of any way to get it to work. If you control the program being run, have it flush often. Otherwise, glare futilely at the Windows logo. You can even mention the problem on your next blind date if you want it to end early. But I can't think of anything more.

(if somebody knows of a fix, I'd like to hear it. I've seen some code that tries to open a Windows console and screen scrape it, but those solutions keep losing data. It should work if there is a loopback char device out there somewhere).

tdelaney
  • [I've just been made aware](https://stackoverflow.com/questions/19915834/utility-with-unredirectable-output-windows/19917090#comment98624378_19917090) of the existence of the [pseudoconsole API](https://learn.microsoft.com/en-us/windows/console/creating-a-pseudoconsole-session) on Windows 10 v1809. Might be of interest to you. – Harry Johnston May 05 '19 at 02:16

The problem was that the subprocess being called was not flushing after writing to stdout. Thanks to J.F. and tdelaney for pointing me in the right direction. I have raised this with the developer here: http://www.imagemagick.org/discourse-server/viewtopic.php?f=2&t=26276&p=115545#p115545

There doesn't appear to be a workaround for this on Windows other than altering the subprocess's source. Perhaps redirecting the output of the subprocess to a NamedTemporaryFile might work, but I have not tested it, and I think the file would be locked on Windows so that only one of the parent and child could open it at once. Not insurmountable, but annoying. There might also be a way to exec the application through a UnixUtils port of stdbuf or something similar, as J.F. suggested here: Python C program subprocess hangs at "for line in iter"
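For what it's worth, an untested sketch of that temp-file idea (placeholder command; note the child may still block-buffer when writing to a regular file, so this may not help either):

import subprocess
import tempfile
import time

# delete=False so the file can be opened a second time on Windows
out = tempfile.NamedTemporaryFile(delete=False)
proc = subprocess.Popen(
    ['some_child_program'],  # placeholder command
    stdin=subprocess.PIPE,
    stdout=out,
    stderr=subprocess.STDOUT,
)

with open(out.name, 'rb') as tail:
    while proc.poll() is None:
        chunk = tail.read()
        if chunk:
            print(chunk)  # any new output since the last poll
        time.sleep(1)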

If you have access to the source code of the subprocess you're calling, you can always recompile it with buffering disabled. It's simple to disable buffering on stdout in C:

setbuf(stdout, NULL);

or set per-line buffering instead of block buffering:

setvbuf(stdout, (char *) NULL, _IOLBF, 0);

See also: Python C program subprocess hangs at "for line in iter"

Hope this helps someone else down the road.

Chris Edwards
  • Beware: unbuffered output may affect performance negatively; writing each byte synchronously can be very slow (10-100 times slower). – jfs Sep 22 '14 at 12:13

Can you try to close the pipe at the end of listener.py? I think that is the issue.

Kamyar Ghasemlou