I would like to port the following structure (shown in bash) to python3 subprocess:
for i in 1 2 3
do
    echo $i
done | xargs -I {} echo bla {}

which prints:

bla 1
bla 2
bla 3
As the example shows, multiple processes are run one after the other and the outputs are multiplexed as the input of a single process.
I specifically want to use pipes, not temporary files (or files on a ramdisk, etc.). It may not be possible.
I know of subprocess.PIPE, which can be used with Popen.communicate(); I guess the solution is something like:
my_pipe = ?Pipe?()
for i in range(1, 4):
    subprocess.Popen(['echo', '{}'.format(i)], stdout=my_pipe)
subprocess.Popen(['xargs', '-I', '{}', 'echo', 'bla', '{}'], stdin=my_pipe)
Unfortunately I have no idea what ?Pipe? should be; all I ever see is subprocess.PIPE & co.
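For the record, here is the kind of thing I am experimenting with, assuming a raw os.pipe() file descriptor pair can play the role of ?Pipe? (this is my guess, not something I have seen documented as the idiom):

```python
import os
import subprocess

# Guess: use a bare OS pipe; pass its write end as stdout of every echo,
# and its read end as stdin of the single xargs.
read_end, write_end = os.pipe()

# Start the consumer first so it can drain the pipe while we write.
consumer = subprocess.Popen(
    ['xargs', '-I', '{}', 'echo', 'bla', '{}'],
    stdin=read_end,
    stdout=subprocess.PIPE,
)
os.close(read_end)  # the parent no longer needs its copy of the read end

# Run the producers one after the other, all writing into the same pipe,
# like the unrolled "for" loop in the bash version.
for i in range(1, 4):
    subprocess.Popen(['echo', str(i)], stdout=write_end).wait()

os.close(write_end)  # without this, xargs never sees EOF and blocks forever

out = consumer.communicate()[0]
print(out.decode(), end='')
```

This seems to print the expected bla 1 / bla 2 / bla 3 on my Linux box, but I don't know if it is the right way to do it, or whether it is safe in general.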
Could be important, I am looking for an answer that works on Linux.
The original question came up with ImageMagick's convert miff:- | convert -. (It is arguable whether the simplified example I gave is really simpler.)
EDIT:
Maybe my example is not very clear; let me try some ASCII art.
My goal (with the "for" loop unrolled):
(--> marks stdout; -> marks stdin)
echo 1 -->+
          |
echo 2 -->+
          |
echo 3 -->+
          |
          +-------> xargs
            3 2 1
So the stdouts of the echo commands in the "for" loop are multiplexed into the single stdin of xargs.
I do not want to create a filter pipeline:
echo 1 ->+
         |
         +-> echo 2 ->+
                      |
                      +-> echo 3 ->+
                                   |
                                   +-> xargs
Possible workaround, but not what I want:
echo 1 ->+
         |
         +-> xargs
echo 2 ->+
         |
         +-> xargs
echo 3 ->+
         |
         +-> xargs
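In subprocess terms, that workaround would look roughly like the sketch below: each echo gets its own xargs, connected through subprocess.PIPE. It produces the same text, but runs three xargs processes instead of one, which is exactly what I want to avoid.

```python
import subprocess

# Workaround sketch: one xargs per echo, each pair connected by its own pipe.
outputs = []
for i in range(1, 4):
    echo = subprocess.Popen(['echo', str(i)], stdout=subprocess.PIPE)
    xargs = subprocess.Popen(
        ['xargs', '-I', '{}', 'echo', 'bla', '{}'],
        stdin=echo.stdout,
        stdout=subprocess.PIPE,
    )
    echo.stdout.close()  # let echo receive SIGPIPE if xargs exits early
    outputs.append(xargs.communicate()[0])

out = b''.join(outputs)
print(out.decode(), end='')
```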