
I have a program that starts four others, and I want them to run simultaneously, which I can do with subprocess.Popen(). I also want to check whether each one started correctly; if not, do nothing and wait until the next time it gets called. I'm working on Windows. This is what I have so far, but I'm not positive it works the way I think it will, and I don't know how to test it other than to sit around and hope it fails but keeps going. Can someone tell me whether this will work, and if not, why?

import subprocess
import time
from datetime import datetime

while 1:

    Day = time.strftime('%d')
    Month = time.strftime('%m')
    Year = time.strftime('%Y')  # if conditions are just right (the day changes before month and year are calculated), the dates could get off (highly unlikely)
    start = time.clock()
    try:
        p1 = subprocess.Popen(['python', 'C:/Users/tnabrelsfo/Documents/Programs/strippers/TransmitterStrip.py'], stdout=None)
    except OSError:
        print('Error with Transmitter')
    try:
        p2 = subprocess.Popen(['python', 'C:/Users/tnabrelsfo/Documents/Programs/strippers/ReceiverStrip.py'], stdout=None)
    except OSError:
        print('Error with Receiver')
    try:
        p3 = subprocess.Popen(['python', 'C:/Users/tnabrelsfo/Documents/Programs/strippers/UIDStrip.py'], stdout=None)
    except OSError:
        print('Error with UID')
    try:
        p4 = subprocess.Popen(['python', 'C:/Users/tnabrelsfo/Documents/Programs/strippers/BlueStripper.py'], stdout=None)
    except OSError:
        print('Error with Blue')


    p1.wait()
    p2.wait()
    p3.wait()
    p4.wait()
    print('Duration: %.2f' % (time.clock()-start))
    print('\n' + str(datetime.now()) + '\n')
    print('Done with run')

    for x in range(301):
        time.sleep(1)
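
For reference, here is a minimal sketch (Python 2.7, same script paths as above) of one way to structure the loop so that a failed Popen() is simply skipped for that round instead of leaving its variable undefined for the later wait() calls. The SCRIPTS list and the "started" bookkeeping are illustrative assumptions, not part of the original code:

# Sketch only: start each script, remember the ones that launched, and wait
# on just those.  Paths come from the question; the SCRIPTS list and the
# "started" list are assumptions for illustration.
import subprocess
import time
from datetime import datetime

SCRIPTS = [
    ('Transmitter', 'C:/Users/tnabrelsfo/Documents/Programs/strippers/TransmitterStrip.py'),
    ('Receiver', 'C:/Users/tnabrelsfo/Documents/Programs/strippers/ReceiverStrip.py'),
    ('UID', 'C:/Users/tnabrelsfo/Documents/Programs/strippers/UIDStrip.py'),
    ('Blue', 'C:/Users/tnabrelsfo/Documents/Programs/strippers/BlueStripper.py'),
]

while True:
    start = time.clock()
    started = []
    for name, path in SCRIPTS:
        try:
            started.append((name, subprocess.Popen(['python', path])))
        except OSError:
            print('Error with %s' % name)  # could not launch; try again next round

    for name, proc in started:
        proc.wait()  # a child that launched but crashed still ends up here,
                     # just with a non-zero proc.returncode

    print('Duration: %.2f' % (time.clock() - start))
    print('\n' + str(datetime.now()) + '\n')
    print('Done with run')

    time.sleep(301)  # same ~5 minute pause as the for/sleep(1) loop above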
  • The process may start fine, but exit with an error. So while `p1.wait()` is waiting, one of the other processes may have died. Do you need immediate feedback in that case? – Eryk Sun Oct 08 '15 at 20:05
  • @eryksun no I do not need immediate feedback. I just need the program to not die if one of the child processes fails. – SirParselot Oct 08 '15 at 20:22
  • If `stdout=None` is an attempt to discard the output then use `stdout=subprocess.DEVNULL` instead on Python 3 (same for `stderr`). – jfs Oct 09 '15 at 08:25
  • here's [how you could run multiple processes in parallel and check their exit statuses](http://stackoverflow.com/a/12102040/4279) – jfs Oct 09 '15 at 08:27
  • how exactly does your parent script die? (it shouldn't even if child scripts fail) – jfs Oct 09 '15 at 08:28
  • Have you considered [importing the modules and calling the corresponding functions instead of using `subprocess` here](http://stackoverflow.com/a/30165768/4279)? – jfs Oct 09 '15 at 08:29
  • @J.F.Sebastian I am using 2.7 and the output is working the way I want. Also, when a child doesn't execute correctly, it throws errors and kills everything for some reason. And yes, I thought about just calling them as functions, but I need them to run in parallel and to be able to wait on them to finish. – SirParselot Oct 09 '15 at 12:09
  • Don't use PIPE unless you consume the pipe; otherwise your child process may hang. [To discard output on Python 2](http://stackoverflow.com/q/11269575/4279). Update your question and provide the full traceback that demonstrates how you think an error in a child process kills the parent. – jfs Oct 09 '15 at 12:16
  • @J.F.Sebastian I noticed PIPE was making it hang, and I don't remember what the error was, but it was definitely killing my parent process. I used the first link you provided, which makes the most sense to me. I'm not sure what was causing the errors in the first place. – SirParselot Oct 09 '15 at 12:24
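
Pulling the comment suggestions together, a rough Python 2.7 sketch of discarding child output without PIPE and then checking each child's exit status. The shortened script names and the os.devnull handling are assumptions in the spirit of the linked answers, not code from the question:

# Rough sketch (Python 2.7): run the children in parallel, throw away their
# output via os.devnull (no PIPE, so nothing can fill up and block), then
# report any non-zero exit statuses.  Script names are shortened placeholders.
import os
import subprocess

scripts = ['TransmitterStrip.py', 'ReceiverStrip.py', 'UIDStrip.py', 'BlueStripper.py']

with open(os.devnull, 'wb') as devnull:  # Python 2 stand-in for subprocess.DEVNULL
    procs = [subprocess.Popen(['python', s], stdout=devnull, stderr=devnull)
             for s in scripts]

for script, proc in zip(scripts, procs):
    if proc.wait() != 0:  # child started but exited with an error
        print('%s exited with status %s' % (script, proc.returncode))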

0 Answers