I want to redirect the output of (potentially multiple) processes started from within a Python script to both stdout and a log file, much like the Unix tool tee does. As the processes are rather long-running, I would like to show/log each line as it is printed. Line buffering is fine, but waiting for a process to finish before anything appears on stdout is not an option.
I cannot find a way to achieve that, though. Looking at subprocess.Popen(), I see that I can forward the output from stdout to a pipe, to a file descriptor, or to a file object. But it seems I have to wait until the process ends when calling communicate() to read from the pipe; I don't have a file descriptor; and faking a file object does not work either, as Popen seems to need a real file.
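For illustration, this is roughly what I mean by "faking a file object" (Tee, out.log and my_command are just placeholder names):

```python
import subprocess
import sys

class Tee:
    """Fake file object that forwards writes to several real files."""
    def __init__(self, *files):
        self.files = files
    def write(self, data):
        for f in self.files:
            f.write(data)
    def flush(self):
        for f in self.files:
            f.flush()

log = open("out.log", "w")
# Fails with AttributeError: Popen calls fileno() on the stdout
# argument, so a duck-typed file object is not enough.
proc = subprocess.Popen(["my_command"], stdout=Tee(sys.stdout, log))
```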
The only solution I know of so far is reading line by line from Popen, as suggested in Python Popen: Write to stdout AND log file simultaneously. However, this works for a single process only. I would like to handle multiple processes launched from the Python script, all printing to stdout and into the same log.
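For reference, the line-by-line approach from that answer looks roughly like this (my_command again a placeholder); the loop is busy with one process until it exits, so I see no way to interleave a second one:

```python
import subprocess
import sys

log = open("out.log", "w")
proc = subprocess.Popen(["my_command"], stdout=subprocess.PIPE)

# Tee each line as it arrives -- live output plus log file.
# But this loop blocks on a single process until it finishes,
# so a second Popen cannot be serviced at the same time.
for line in iter(proc.stdout.readline, b""):
    text = line.decode()
    sys.stdout.write(text)
    log.write(text)
proc.wait()
```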
I especially focused on using subprocess.Popen. Qt's QProcess seems to be a potential solution, since callbacks could do the job, but unfortunately I do not have a GUI application and I do not know how to run a Qt event loop within a linearly executed Python script.
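From what I gather, the callback-based version would look something like the sketch below (assuming PyQt5; my_command is a placeholder), but app.exec_() takes over the control flow, which is exactly what I don't know how to reconcile with a linear script:

```python
import sys
from PyQt5.QtCore import QCoreApplication, QProcess

app = QCoreApplication(sys.argv)
log = open("out.log", "w")

proc = QProcess()

def on_output():
    # Callback fires whenever the child prints something.
    data = proc.readAllStandardOutput().data().decode()
    sys.stdout.write(data)
    log.write(data)

proc.readyReadStandardOutput.connect(on_output)
proc.finished.connect(app.quit)
proc.start("my_command")
app.exec_()  # blocks here -- the event loop replaces my linear flow
```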
Is there any other way to capture the output in a log file while at the same time showing it live on stdout?
Thanks!