My file structure looks like this:

runner.py
scripts/
    something_a/
        main.py
        other_file.py
    something_b/
        main.py
        anythingelse.py
    something_c/
        main.py
    ...

runner.py should look at all folders in scripts/ and run the main.py located there.

Right now I'm achieving this through subprocess.check_output. It works, but some of these scripts take a long time to run and I don't get to see any progress; it prints everything after the process has finished.

I'm hoping to find a solution that allows for 2 things to be done somewhat easily:

1) Stream the output instead of getting it all at the end

2) Doesn't prohibit running multiple scripts at once

Is this possible? A lot of the solutions I've seen for running a Python script from another require knowledge of the other script's name/location. I can also enforce that all the main.py's have a specific function if that helps.
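For the discovery step, here is a minimal sketch of how runner.py could collect the scripts without knowing their names in advance (find_mains is a hypothetical helper, not from the question):

```python
from pathlib import Path

def find_mains(scripts_dir="scripts"):
    """Return the main.py from every immediate subfolder of scripts/."""
    return sorted(Path(scripts_dir).glob("*/main.py"))
```

Each returned path can then be handed to whatever launching mechanism you settle on.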

Pika Supports Ukraine
user3715648
    I would either look at using the threading library (https://docs.python.org/3/library/threading.html) so that you can run multiple scripts at the same time and if one of them hangs it doesn't impact the rest; or if that doesn't suit use a timeout on your subprocess call to stop the job after a certain time. See https://docs.python.org/3/library/signal.html#signal.alarm – dwagon Apr 12 '19 at 18:56
  • Did you try https://docs.python.org/3/library/subprocess.html#subprocess.Popen.communicate? Do you need *stdout* and / or *stderr*? Also, could you please add what you have so far? – CristiFati Apr 12 '19 at 18:57
  • The main issue with subprocess isn't that it takes too long or that I can't parallelize it. It's just that I want to stream the process which doesn't seem to be easily supported with subprocess. – user3715648 Apr 12 '19 at 18:57
  • Have a look here for how to stream the output with subprocess: https://stackoverflow.com/questions/18421757/live-output-from-subprocess-command – Valentino Apr 12 '19 at 19:06

1 Answer


You could use Popen to loop over each script and redirect its output to a separate log file. You can then read from those files in real time while each one is being populated. :)

Presenting the combined output in a readable form is a little trickier. You could write another script that reads these log files as they grow and decides how to present the interleaved information in an understandable manner.

""" Use the same command as you would do for check_output """
cmd = ''

for filename in scriptList:
   log = filename + ".log"
   with io.open(filename, mode=log) as out:
        subprocess.Popen(cmd, stdout=out, stderr=out)
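If you would rather see the output on the console as it happens instead of going through log files, a pipe-based sketch (separate from the approach above; the commands below are placeholders, not the questioner's scripts):

```python
import subprocess
import sys
import threading

def stream(cmd, tag, sink=sys.stdout):
    """Run one command, echoing its output line by line as it arrives."""
    proc = subprocess.Popen(
        cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=True
    )
    lines = []
    for line in proc.stdout:  # lines arrive as the child prints them
        sink.write(f"[{tag}] {line}")
        lines.append(line)
    proc.wait()
    return lines

# Two children at once; their output interleaves live instead of at the end.
cmds = {
    "a": [sys.executable, "-c", "print('hello from a')"],
    "b": [sys.executable, "-c", "print('hello from b')"],
}
threads = [threading.Thread(target=stream, args=(cmd, tag))
           for tag, cmd in cmds.items()]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

For long-running children, note that Python buffers stdout when it is a pipe, so run them with python -u (or set PYTHONUNBUFFERED=1) so lines are flushed promptly.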
David Silveiro