I have an executable, BOB.exe, which prints an enormous amount of text to stdout with only short pauses. BOB also has a habit of freezing, so I wrote a Python monitor function that uses the subprocess module to launch BOB, redirect its stdout to a temporary file, and watch that file's size to see whether BOB has crashed. This is my current solution:
    #!/usr/bin/python
    from subprocess import Popen
    import os, tempfile, time

    def runBOB(argsList):
        # Create a temporary file where BOB's stdout will be piped
        BOBout = tempfile.NamedTemporaryFile()
        BOBoutSize = 0
        # Start the subprocess of BOB
        BOBsp = Popen(argsList, stdout=BOBout)
        while True:
            # Check every 10 seconds (sleep first, so BOB has a chance to write)
            time.sleep(10)
            # See if the subprocess has finished
            if BOBsp.poll() is not None:
                BOBout.close()  # Destroy the temp file
                return 0
            # If the size of the stdout file has increased, BOB.exe is still running
            BOBoutSizeNew = os.path.getsize(BOBout.name)
            if BOBoutSizeNew > BOBoutSize:
                BOBoutSize = BOBoutSizeNew
            else:  # If not, kill it
                BOBsp.kill()
                BOBout.close()  # Destroy the temp file
                return 1
However, this is incredibly slow, and I suspect the writing to file is the cause. Is there a more efficient way to do this, such as watching the stdout stream and immediately sending it to null? Anything that cuts down on the megabytes of printing would probably help. Is there another way of telling whether the exe has crashed? I should note that I don't care about the stdout; it is all going to be ignored anyway.
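For reference, this is roughly what I have in mind (an untested sketch: drain stdout through a pipe in a background thread, discard the data, and treat a stalled stream as a freeze; the function name, the 64 KB chunk size, and the timeout value are just placeholders I made up):

```python
import subprocess
import threading
import time


def runBOB(argsList, timeout=10):
    """Run BOB, discarding its stdout; kill it if output stalls for `timeout` seconds."""
    proc = subprocess.Popen(argsList, stdout=subprocess.PIPE)
    last_output = [time.time()]  # mutable cell so the thread can update it

    def drain():
        # Read and throw away stdout in chunks; each successful read
        # proves BOB is still producing output
        while proc.stdout.read(64 * 1024):
            last_output[0] = time.time()

    t = threading.Thread(target=drain)
    t.daemon = True  # don't let the reader thread keep the interpreter alive
    t.start()

    while proc.poll() is None:
        if time.time() - last_output[0] > timeout:
            proc.kill()  # no output for `timeout` seconds: assume frozen
            return 1
        time.sleep(1)
    return 0  # BOB exited on its own
```

This way nothing ever touches the disk, and `poll()` still handles the normal-exit case, but I'm not sure whether the pipe reads add their own overhead.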
Thank you for your help!