I need to implement in Python a "supervisor" process that starts multiple subprocesses and monitors their operation. One of the tasks to be solved is logging the error messages they generate. After reading that question, I came up with the following solution:
#!/usr/bin/python
import subprocess
import threading
import time
import logging, logging.handlers
def log_subprocess_output(pipe):
    with pipe:
        for line in iter(pipe.readline, b''):  # b'\n'-separated lines
            myLogger.info('got line from subprocess: %r', line)
    myLogger.info("Leaving output handler")
myLogger = logging.getLogger('MTEST')
myLogger.setLevel(logging.DEBUG)
myHandler = logging.FileHandler('log.txt')
formatter = logging.Formatter('%(asctime)s %(name)-15s %(levelname)-8s %(message)s')
myHandler.setFormatter(formatter)
myLogger.addHandler(myHandler)
myLogger.info('Test info message')
myLogger.debug('Test debug message')
myLogger.error('Test error message')
npar = [["test1", "1.5", "10"],
        ["test2", "1.3", "20"],
        ["test3", "0.8", "30"]]
for pars in npar:
    # Let's start the external application
    cmd = ["./ext.py"] + pars
    pd = subprocess.Popen(cmd, stderr=subprocess.PIPE)
    # Now we should start the thread (process?) reading the stderr
    th = threading.Thread(target=log_subprocess_output, args=(pd.stderr,))
    th.start()
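For completeness: in the snippet above I never wait for the subprocesses or the reader threads. The bookkeeping I have in mind for that part (a rough sketch only, names are illustrative) looks like this:

threads = []
processes = []
for pars in npar:
    cmd = ["./ext.py"] + pars
    pd = subprocess.Popen(cmd, stderr=subprocess.PIPE)
    th = threading.Thread(target=log_subprocess_output, args=(pd.stderr,))
    th.start()
    threads.append(th)
    processes.append(pd)

# Wait until every subprocess has exited and every reader thread
# has drained and closed its pipe
for pd in processes:
    pd.wait()
for th in threads:
    th.join()

That part is not the issue, though; my doubt is about the stderr-reading threads themselves.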
The external application used for testing generates a configurable number of artificial error messages with a configurable period (the command-line arguments are the message prefix, the period in seconds, and the number of messages):
#!/usr/bin/python
import sys
import time
period = float(sys.argv[2])
number = int(sys.argv[3])
for i in range(0, number):
    time.sleep(period)
    sys.stderr.write(sys.argv[1] + " " + str(i) + '\n')
The presented solution works reliably (I have even tested it with the HTTP-connected remote logging server presented in that question). However, I dislike having to mix the subprocess and threading modules. Is there any better solution for logging error messages from subprocesses?
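One direction I was wondering about (a rough, untested sketch, assuming Python 3.7+ and reusing the myLogger and npar objects from above) is asyncio's subprocess support, which can read the pipes without extra threads:

import asyncio

async def run_and_log(pars):
    # Start the external application with stderr captured
    proc = await asyncio.create_subprocess_exec(
        "./ext.py", *pars, stderr=asyncio.subprocess.PIPE)
    # The StreamReader yields b'\n'-terminated lines until EOF
    async for line in proc.stderr:
        myLogger.info('got line from subprocess: %r', line)
    await proc.wait()
    myLogger.info("Leaving output handler")

async def main():
    # Run all external applications concurrently in one event loop
    await asyncio.gather(*(run_and_log(pars) for pars in npar))

asyncio.run(main())

But I am not sure that pulling in an event loop just for this is really an improvement, hence the question.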