I want to run a command line program from Python and capture its output. I can use `subprocess.check_output`, but then nothing shows up on the console while the command runs. How can I capture the output without suppressing the console output?

demonguy
- @KenjiNoguchi I need it to output on the console; I don't want to `print` the return value, because sometimes I want to debug whether the subprocess is hanging – demonguy Mar 03 '16 at 06:26
- To support multiple pipes, see [Python subprocess get children's output to file and terminal?](http://stackoverflow.com/q/4984428/4279) (threads-based implementation + a link to an asyncio version) – jfs Mar 03 '16 at 12:02
- Related: [Displaying subprocess output to stdout and redirecting it](http://stackoverflow.com/q/25750468/4279) (the question from the [subprocess tag description](http://stackoverflow.com/tags/subprocess/info)) – jfs Mar 03 '16 at 12:20
1 Answer
How about this?
```python
from subprocess import Popen, PIPE

# Run a listening netcat as the child process and capture its stdout via a pipe
proc = Popen(["/usr/bin/nc", "-l", "9999"], stdout=PIPE)

buffer = []
line = proc.stdout.readline()
while line:
    buffer.append(line)             # keep the line for later
    print "LINE", line.strip()      # and echo it to the console as it arrives
    line = proc.stdout.readline()

print "buffer", ''.join(buffer)
```
From another terminal, send some text:

```sh
nc localhost 9999
# type something; the text should appear from the Python code
```

When you break the `nc` connection, the loop ends and you get the accumulated output in `buffer` as well.

Kenji Noguchi
- @demonguy: or more succinctly: [`for line in iter(proc.stdout.readline, b''): print line,` (note: comma at the end)](http://stackoverflow.com/a/17698359/4279) (a sketch of this variant follows after these comments). Unrelated: you could use the `socket` module instead of the `nc` command. – jfs Mar 03 '16 at 11:58
- But I see documentation like this: `Warning: Use communicate() rather than .stdin.write, .stdout.read or .stderr.read to avoid deadlocks due to any of the other OS pipe buffers filling up and blocking the child process.` How do I avoid that? – demonguy Mar 04 '16 at 09:06
- @demonguy: it doesn't apply if all you need is to read a *single* pipe. Look at the links I've provided in the comments, which use threads or asyncio to consume *multiple* pipes and avoid the deadlock (see the threads sketch below). – jfs Mar 04 '16 at 13:37
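
For reference, here is a minimal sketch of the `iter`-based variant from jfs's comment above, assuming the same listening-`nc` example as the answer (Python 2 syntax, to match the answer's code):

```python
from subprocess import Popen, PIPE

# Same listening-nc example as the answer; iter() keeps calling readline()
# until it returns an empty string at EOF
proc = Popen(["/usr/bin/nc", "-l", "9999"], stdout=PIPE)
buffer = []
for line in iter(proc.stdout.readline, b''):
    buffer.append(line)             # capture the line
    print "LINE", line.strip()      # and echo it to the console as it arrives
print "buffer", ''.join(buffer)
```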
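
And a minimal sketch of the threads-based approach the comments point to for consuming *multiple* pipes; `some_command` is a hypothetical placeholder for any program that writes to both stdout and stderr (again Python 2 syntax):

```python
from subprocess import Popen, PIPE
from threading import Thread

def tee(pipe, label, sink):
    # Drain one pipe in its own thread so neither OS pipe buffer
    # can fill up and block the child process
    for line in iter(pipe.readline, b''):
        sink.append(line)            # capture
        print label, line.strip()    # and echo to the console
    pipe.close()

# "some_command" is a placeholder, not anything from the original posts
proc = Popen(["some_command"], stdout=PIPE, stderr=PIPE)
out_lines, err_lines = [], []
threads = [Thread(target=tee, args=(proc.stdout, "OUT", out_lines)),
           Thread(target=tee, args=(proc.stderr, "ERR", err_lines))]
for t in threads:
    t.start()
for t in threads:
    t.join()
proc.wait()
print "captured stdout:", ''.join(out_lines)
```

Each pipe is drained by its own thread, so neither pipe buffer fills up and blocks the child, which is the deadlock the `communicate()` warning refers to.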