
I would like to port the following structure (shown in bash) to Python 3's subprocess module:

for i in 1 2 3
do
    echo $i
done | xargs -I {} echo bla {}

bla 1
bla 2
bla 3

As the example shows, multiple processes are run one after the other, and their outputs are multiplexed into the input of a single process.

I specifically want to use pipes, not files (or files on a ramdisk), etc. It may not be possible.
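For reference, a literal port can simply hand the whole pipeline to a shell. A minimal sketch (this runs, but it delegates all the plumbing to /bin/sh, which is exactly the part I want to do in Python):

import subprocess

# Baseline: let the shell build the pipe, exactly as in the bash original.
result = subprocess.run(
    'for i in 1 2 3; do echo $i; done | xargs -I {} echo bla {}',
    shell=True, stdout=subprocess.PIPE, universal_newlines=True)
print(result.stdout, end='')   # bla 1 / bla 2 / bla 3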


I know of subprocess.PIPE, which can be used with Popen.communicate(); I guess the solution is something like:

my_pipe = ?Pipe?()
for i in range(1, 4):
    subprocess.Popen(['echo', '{}'.format(i)], stdout=my_pipe)

subprocess.Popen(['xargs', '-I', '{}', 'echo', 'bla', '{}'], stdin=my_pipe)

Unfortunately I have no idea what ?Pipe? should be; all I ever see is subprocess.PIPE and co.
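Here is a sketch of what I imagine, assuming os.pipe() is the missing piece (it returns a pair of raw file descriptors, read end first; untested beyond this toy case):

import os
import subprocess

read_fd, write_fd = os.pipe()   # os.pipe() returns (read end, write end)

# Run the producers one after the other; they all write to the same pipe.
for i in range(1, 4):
    subprocess.run(['echo', str(i)], stdout=write_fd)
os.close(write_fd)              # close the write end so xargs sees EOF

# The consumer reads the multiplexed stream from the read end.
subprocess.run(['xargs', '-I', '{}', 'echo', 'bla', '{}'], stdin=read_fd)
os.close(read_fd)

Two caveats: stdout must get the write end (passing the read end is what causes the "Bad file descriptor" error quoted in the comments below), and since the producers run to completion before xargs starts, their combined output has to fit in the kernel's pipe buffer (64 KiB by default on Linux) or the loop blocks forever.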


Could be important: I am looking for an answer that works on Linux.

The original problem came up with ImageMagick's convert miff:- | convert - . (It is arguable whether the simplified example I gave is really simpler.)


EDIT:

Maybe my example is not very clear, so I'll try to illustrate it with some ASCII art:

My goal (the "for" loop is shown unrolled):

(the arrows run from each process's stdout to the next process's stdin)

echo 1 -->+
          |
echo 2 -->+
          |
echo 3 -->+
          |
          +-------> xargs
            3 2 1

So the outputs of the echos (the commands in the "for" loop) are multiplexed into the single stdin of xargs.

I do not want to create a filter pipeline:

echo 1 ->+
         |
         +-> echo 2 ->+
                      |
                      +-> echo 3 ->+
                                   |
                                   +-> xargs

Possible workaround, but not what I want (a sketch follows the diagram):

echo 1 ->+
         |
         +-> xargs

echo 2 ->+
         |
         +-> xargs

echo 3 ->+
         |
         +-> xargs
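For completeness, that rejected workaround is easy to write down; a sketch (one xargs per producer, so the consumer runs three times):

import subprocess

for i in range(1, 4):
    # Each producer gets its own pipe and its own xargs run.
    producer = subprocess.Popen(['echo', str(i)], stdout=subprocess.PIPE)
    subprocess.run(['xargs', '-I', '{}', 'echo', 'bla', '{}'],
                   stdin=producer.stdout)
    producer.stdout.close()
    producer.wait()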
Zoltan K.
  • What are you trying to achieve exactly? You can create your own pipes using [`os.pipe()`](https://docs.python.org/3/library/os.html#os.pipe) if that's what you're looking for. – zwer Jun 10 '18 at 22:19
  • At this point, I would like to know if the described problem can be solved with subprocess. I would prefer subprocess.Popen over os.popen. os.pipe() does not seem to achieve that: `p=os.pipe(); subprocess.run(['ls'], stdout=p[0])` -> `ls: write error: Bad file descriptor` – Zoltan K. Jun 10 '18 at 23:24
  • @CrackerJack9 The link above is about _chaining_ the inputs and the outputs. I don't see how this can be applied to this situation. – Zoltan K. Jun 11 '18 at 02:22
  • @ZoltanK. they can be "chained" together exactly as has been answered before, just as the shell does it by piping stdout from one command to stdin of the next – CrackerJack9 Jun 11 '18 at 03:32
  • @CrackerJack9 I reviewed the proposed answer and added some edits that should clarify why I think it does not answer my question. If it does work, please explain in more detail, because I cannot see it! – Zoltan K. Jun 12 '18 at 00:04
  • What about this: https://stackoverflow.com/a/50226642/977593 :D? – slackmart Jun 12 '18 at 00:10
  • @slackmart If my "for" loop had exactly one cycle, that would be the solution. I want to execute a bunch of independent processes, one after the other, serialize all of their outputs into a single stream, and let another process read the whole stream. Running all those processes, collecting their outputs, using Python code to concatenate them, and packing them into a file-like object is probably a valid workaround (sketched below). But what I really want to know is whether I can make subprocess merge the data for me, with some clever syntax. – Zoltan K. Jun 12 '18 at 00:18
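A sketch of the workaround described in the last comment: collect the producers' outputs in Python and hand the concatenated stream to xargs via communicate(). Everything is buffered in the parent process, so there is no pipe-buffer limit, but also no streaming:

import subprocess

# Collect each producer's output in the parent process.
chunks = [subprocess.run(['echo', str(i)], stdout=subprocess.PIPE).stdout
          for i in range(1, 4)]

# Feed the concatenated stream to the consumer in one go.
consumer = subprocess.Popen(['xargs', '-I', '{}', 'echo', 'bla', '{}'],
                            stdin=subprocess.PIPE)
consumer.communicate(b''.join(chunks))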
