
The goal is to pass some data (a few bytes) from a child process to a parent process in Python (stdout and stderr can’t be used). The following is a simplified version (the actual code reads all data from the pipe before waiting for the subprocess to exit).

import os
import subprocess
import sys

CHILD_SCRIPT = r'''
import os
os.write({wfd}, b'Hello, World!')
'''

rfd, wfd = os.pipe()
if hasattr(os, 'set_inheritable'):  # os.set_inheritable() exists only on Python 3.4+
    os.set_inheritable(wfd, True)   # PEP 446: FDs are non-inheritable by default since 3.4
subprocess.check_call([sys.executable, '-c', CHILD_SCRIPT.format(wfd=wfd)])
print(os.read(rfd, 1024))

On Unix, it works on Python 2, but not on Python 3, where it fails with:

Traceback (most recent call last):
  File "<string>", line 3, in <module>
OSError: [Errno 9] Bad file descriptor
Traceback (most recent call last):
  File "test_inherit_fd.py", line 13, in <module>
    subprocess.check_call([sys.executable, '-c', CHILD_SCRIPT.format(wfd=wfd)])
  File "/usr/lib/python3.8/subprocess.py", line 364, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/bin/python3', '-c', "\nimport os\nos.write(5, b'Hello, World!')\n"]' returned non-zero exit status 1.

Apparently, the failure is unrelated to whether or not the FD is inheritable: on Python 3.2, where FDs were still inheritable by default, it fails the same way. What else causes the difference? (EDIT: The reason is that, starting with Python 3.2, subprocess closes all file descriptors above 2 in the child by default, i.e. close_fds defaults to True. The problem can be solved by passing close_fds=False.)
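
For example, on Unix under Python 3, the snippet above succeeds once the call is changed as follows (a minimal sketch of the fix described in the EDIT; the set_inheritable call is still required on Python 3.4+):

# With close_fds=False, inherited FDs obey their inheritable flag,
# so wfd (made inheritable above) stays open in the child.
subprocess.check_call([sys.executable, '-c', CHILD_SCRIPT.format(wfd=wfd)],
                      close_fds=False)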

On Windows, it fails on Python 2 and Python 3.

What’s a robust and clean way to pass data (a few bytes) from the child to the parent, that works on Python 2 and Python 3 on all platforms?

Manuel Jacob

2 Answers


From the subprocess documentation:

If close_fds is true, all file descriptors except 0, 1 and 2 will be closed before the child process is executed. Otherwise when close_fds is false, file descriptors obey their inheritable flag as described in Inheritance of File Descriptors.

pass_fds is an optional sequence of file descriptors to keep open between the parent and child.

To make it work, add either close_fds=False or pass_fds=[wfd] as an argument to subprocess.check_call. A quick test reveals that if you use pass_fds, then you won't need the call to set_inheritable anymore, but this isn't true if you use close_fds.
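
For instance, the question's snippet works on Unix under Python 3 with pass_fds (a minimal sketch; pass_fds is POSIX-only, and providing it forces close_fds to True):

import os
import subprocess
import sys

CHILD_SCRIPT = r'''
import os
os.write({wfd}, b'Hello, World!')
'''

rfd, wfd = os.pipe()
# pass_fds keeps wfd open in the child and marks it inheritable,
# so no explicit os.set_inheritable() call is needed.
subprocess.check_call([sys.executable, '-c', CHILD_SCRIPT.format(wfd=wfd)],
                      pass_fds=[wfd])
print(os.read(rfd, 1024))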


I think what you're looking for is a multiprocessing.Queue.

Python 3: https://docs.python.org/3/library/multiprocessing.html#exchanging-objects-between-processes

Python 2: https://docs.python.org/2/library/multiprocessing.html#exchanging-objects-between-processes

You could also use a Pipe (from the same module), but it's a lot slower than a Queue; see: "Python 3.4 multiprocessing Queue faster than Pipe, unexpected"
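
For comparison, here's a minimal Pipe-based sketch of the same idea (passing a few bytes from the child back to the parent):

from multiprocessing import Process, Pipe

def child(conn):
    conn.send(b'Hello, World!')  # send the payload to the parent
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=child, args=(child_conn,))
    p.start()
    print(parent_conn.recv())    # prints b'Hello, World!'
    p.join()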

EDIT

"How does that solve the question?"

Well, you're trying to execute code in a child process, and you need a way to get the data back to the parent process. If you look at the documentation, you'll see a code snippet like this:

from multiprocessing import Process, Queue

def f(q):
    q.put([42, None, 'hello'])

if __name__ == '__main__':
    q = Queue()
    p = Process(target=f, args=(q,))
    p.start()
    print(q.get())    # prints "[42, None, 'hello']"
    p.join()

In this case f is the code that's going to be executed by the child process. You can do whatever you want there! Make a 1000-line function. Create a class, instantiate it, and go nuts. Change the name from f to something meaningful. Put that function in a different file, import it, and have it executed, as sketched below.
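
For instance, a sketch with the worker function in its own module (the module and function names here are made up for illustration):

# worker.py (hypothetical module)
def compute(q):
    q.put(b'Hello, World!')

# main.py
from multiprocessing import Process, Queue
from worker import compute

if __name__ == '__main__':
    q = Queue()
    p = Process(target=compute, args=(q,))
    p.start()
    print(q.get())    # prints b'Hello, World!'
    p.join()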

Mike Sandford