
In a bash script, I can work with both pipes interchangeably. If I want to do the same from within a Python script, how would I go about reading from sys.stderr?

Currently, I am doing the following:

#!/usr/bin/env python
import sys

std_input = sys.stdin.readlines()
std_error = sys.stderr.readlines() # <-- fails, complaining that the file
                                   #     is not open for reading.

And I call the code from within a bash script:

#!/usr/bin/env bash
... | ./python-script.py

where there is both stdin and stderr content on the pipeline streaming into the python-script.py command.

There is no need for concurrency or anything fancy.

Chris
  • You need "concurrency or something fancy" to ensure you don't deadlock because you're blocking on a read to one pipe while the subprocess is blocking on a write to the other. Do you need the data *streamed* or is just *buffering* all the data and reading at the end sufficient? If the latter, `subprocess.run` is what you're looking for. – Daniel Pryden Mar 19 '19 at 22:19
  • Wait, you're streaming *into* the Python process? There is only one `stdin`. There is no standard stream for "error in", only "error out" (`stderr`). If you want to feed `stdout` and `stderr` from one process into `stdin` of another, you just need to redirect `stderr` into `stdout`. The data gets mixed into a single stream, which you can read on the other end from `stdin`. – Daniel Pryden Mar 19 '19 at 22:22
  • @DanielPryden sounds like I’ll need to use Bash as a wrapper then, thanks for explaining that – Chris Mar 19 '19 at 22:36
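The redirect described in the comments can be sketched as below; the `{ ...; }` group is a stand-in for the real upstream command, and `python3 -c` stands in for python-script.py:

```shell
#!/usr/bin/env bash
# Merge stderr into stdout (2>&1) before the pipe, so the downstream
# Python script receives both streams as a single stream on its stdin.
{ echo "to stdout"; echo "to stderr" >&2; } 2>&1 \
    | python3 -c 'import sys; print(sys.stdin.read(), end="")'
```

Note that once merged, the two streams are indistinguishable on the receiving end; if the reader needs to tell them apart, the producer would have to tag each line before merging.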

0 Answers