
I have a Python script that executes two main tasks:

  • Execute code from the script itself
  • Launch a background process with multiprocessing.Process(target=...)

My question is: Is there any way of muting the stdout from that particular process without affecting the main process? I've tried to change it through sys.stdout, but it affects every single process and the main process (every instance of the program points to the same object):

>>> import multiprocessing
>>> import sys
>>> def a():
...     print('{} - {}'.format(sys.stdout, id(sys.stdout)))
... 
>>> for i in range(5):
...     multiprocessing.Process(target=a).start()
... 
<_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'> - 140230387621232
<_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'> - 140230387621232
<_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'> - 140230387621232
<_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'> - 140230387621232
>>> <_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'> - 140230387621232

>>> a()
<_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'> - 140230387621232

It is not possible to delete all print() statements from the function being executed by the process, since another routine of the program calls that function on the main process and it needs those print statements.

I also noticed that I could use a boolean flag to indicate whether the print should be executed or not, but I was hoping that anyone could give me a better approach.

Thank you very much!

kadamb
Cblopez

2 Answers


I've tried to change it through sys.stdout, but it affects every single process and the main process (every instance of the program points to the same object)

The solution you offered here actually works. Try running this simple example:

import multiprocessing
import sys

def a(no_stdout):
    if no_stdout:
        sys.stdout = None  # print() becomes a no-op when sys.stdout is None
    print(id(sys.stdout))

multiprocessing.Process(target=a, args=(False,)).start() # Outputs the id
multiprocessing.Process(target=a, args=(True,)).start()  # Outputs nothing
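This works because CPython's built-in print() simply returns without writing anything when sys.stdout is None. A quick in-process sketch (restoring stdout afterwards so later output still appears):

```python
import sys

saved = sys.stdout
sys.stdout = None
print("this line is suppressed")  # no-op: print() returns early when sys.stdout is None
sys.stdout = saved
print("this line appears")
```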

The reason all the processes in your example print the same id is that multiprocessing.Process uses os.fork on your platform. After a fork, the child's memory is an exact copy of the parent's: every object keeps the same virtual address (and even shares the same physical pages, copy-on-write, until one side modifies them). So although id() reports the same number everywhere, each process has its own independent sys.stdout object in its own memory.
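The same copy-on-write isolation applies to any module-level object, not just sys.stdout. A minimal sketch (the counter name is illustrative), which behaves the same under both the fork and spawn start methods:

```python
import multiprocessing

counter = [0]  # module-level state, copied into the child

def bump():
    counter[0] += 1  # mutates only the child's private copy

if __name__ == "__main__":
    p = multiprocessing.Process(target=bump)
    p.start()
    p.join()
    print(counter[0])  # the parent's copy is untouched: prints 0
```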

kmaork

You have to do it inside the process, after the fork. This is a bit crude, but it seems to work:

#!/usr/bin/env python3

import multiprocessing
import sys

class SinkLogger:
    """A stand-in for sys.stdout that discards everything written to it."""

    def write(self, message):
        pass

    def flush(self):
        pass

def a(i):
    if i % 2 == 1:  # silence the odd-numbered processes only
        sys.stdout = SinkLogger()
    print('{} - {} - {}'.format(i, sys.stdout, id(sys.stdout)))

for i in range(5):
    multiprocessing.Process(target=a, args=(i,)).start()

print("done")

Source of the idea: How to redirect stdout to both file and console with scripting?
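If you can wrap the function the child runs, a tidier variant of the same after-fork idea is contextlib.redirect_stdout pointed at os.devnull. The quiet helper below is illustrative, not part of the original answer:

```python
import contextlib
import multiprocessing
import os

def quiet(target, *args):
    # Swap this process's stdout for the null device while target runs
    with open(os.devnull, "w") as devnull, contextlib.redirect_stdout(devnull):
        target(*args)

def work():
    print("you will not see this")

if __name__ == "__main__":
    p = multiprocessing.Process(target=quiet, args=(work,))
    p.start()
    p.join()
```

Because the redirection happens inside the child, the parent's stdout is untouched, and the same work function still prints normally when called directly in the main process.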

teki
  • That seems to work. The object is not needed though; `sys.stdout = None` works just as well. But how is it that `sys.stdout = something` in the main process changes all of its child processes' STDOUT (which is fine, because it should work like that), while a child process modifying its own STDOUT won't change the other processes' STDOUT (which is also fine, but every `sys.stdout` points to the same object!)? Is the `multiprocessing` module doing those little changes under the table? – Cblopez Jun 01 '20 at 12:49