I spawn a number of subprocesses and want to write their output both to a file and to the console.
My current setup takes a dict containing a command and a file name, runs the command, and waits for it to finish. This works fine and the output is redirected to the file.
    import io
    import os
    import subprocess

    listofwork = [
        {'cmd': 'somecmd1', 'file': 'somefilename1.log'},
        {'cmd': 'somecmd2', 'file': 'somefilename2.log'},
        {'cmd': 'somecmd3', 'file': 'somefilename3.log'},
    ]
    logpath = '/logs'

    child_processes = []
    for work in listofwork:
        with io.open(os.path.join(logpath, work['file']), mode='wb') as out:
            # the child gets its own copy of the file descriptor,
            # so closing `out` here is safe
            p = subprocess.Popen(work['cmd'], stdout=out, stderr=out)
        child_processes.append(p)

    for cp in child_processes:
        cp.communicate()
Basically I want to keep this behaviour and also show the output on screen. I have found this question:
Python Popen: Write to stdout AND log file simultaneously
but I want to keep the streams separate so I can track which output is currently being read and act accordingly.
So how can I write to the log and to the screen without combining everything into stdout immediately? Most convenient would be if the output went into some internal data structure that I can print to the screen manually. I need to be able to print in real time, not wait for the process to end with .communicate(), because the processes run for hours.
I can't use StringIO because it throws an error that it has no attribute fileno().
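For context, here is a minimal sketch of the kind of thing I mean, in case it helps clarify the question. It swaps the file redirect for stdout=subprocess.PIPE and gives each process a background thread that tees its lines into its own log file and into a shared queue (the "internal data structure"), which the main thread drains and prints in real time. The commands and log file names below are placeholders; I use `python -c` just so the snippet runs anywhere:

```python
import queue
import subprocess
import sys
import threading

def stream_reader(name, proc, log_path, q):
    """Read one child's stdout line by line and tee each line to its
    own log file and to a shared queue for the main thread."""
    with open(log_path, 'wb') as log:
        for line in proc.stdout:      # blocks until the child emits a line
            log.write(line)
            q.put((name, line))
    q.put((name, None))               # sentinel: this child is finished

# Placeholder work list -- real long-running commands would go here.
listofwork = [
    {'cmd': [sys.executable, '-c', "print('out 1')"], 'file': 'proc1.log'},
    {'cmd': [sys.executable, '-c', "print('out 2')"], 'file': 'proc2.log'},
]

q = queue.Queue()
threads = []
for work in listofwork:
    p = subprocess.Popen(work['cmd'], stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    t = threading.Thread(target=stream_reader,
                         args=(work['file'], p, work['file'], q))
    t.start()
    threads.append(t)

# Main thread: print lines as they arrive, tagged with their source,
# until every reader thread has sent its sentinel.
remaining = len(threads)
while remaining:
    name, line = q.get()
    if line is None:
        remaining -= 1
    else:
        print('[%s] %s' % (name, line.decode().rstrip()))

for t in threads:
    t.join()
```

Because each line arrives on the queue tagged with its source, the main loop can react per-process instead of seeing one merged stream. Is this thread-per-process approach reasonable, or is there a cleaner way?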