
I have a directory containing a list of files that need running. These are computationally heavy tasks, so I'm looking for the best way to run multiple files at once. My first function for task submission was:

def copasiSE_batch_run(self):
    os.chdir(self.IA_dir)  # dir containing the .cps files
    count = 0
    for i in glob.glob('*.cps'):
        os.system('CopasiSE ' + i)
        count += 1
        print '\n\n\n {}:\tProfile likelihood for {} has been calculated'.format(count, i)

This works, but it only uses one process: it waits until one file has finished running before starting the next. Since these are independent tasks, it would be good if I could get more of them going at the same time. I then tried:

def copasiSE_batch_run_subprocess(self):
    os.chdir(self.IA_dir)
    for i in glob.glob('*.cps'):
        # pass the command as an argument list so Popen works without shell=True
        subprocess.Popen(['CopasiSE', i])

This also works, but gives me the opposite problem: I have 32 files that need running and this function launches them all at once, so my computer is basically unusable until it is done. What would be the best way to strike a balance between the number of programs running at once and leaving enough computing power for reasonably low-intensity tasks?
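One common way to cap concurrency like this (not something from the original post) is a worker pool: `concurrent.futures` takes a `max_workers` argument, so at most that many CopasiSE runs are alive at any moment. A minimal sketch, assuming Python 3, that `CopasiSE` is on the `PATH`, and that 8 workers is a reasonable limit for this machine:

```python
import glob
import subprocess
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_batch(commands, max_workers=8):
    """Run each command (an argument list) with at most `max_workers`
    subprocesses alive at once. Returns exit codes in completion order."""
    codes = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # each worker thread just blocks on one external process
        futures = [pool.submit(subprocess.run, cmd) for cmd in commands]
        for fut in as_completed(futures):
            codes.append(fut.result().returncode)
    return codes

# e.g. run_batch([['CopasiSE', f] for f in glob.glob('*.cps')], max_workers=8)
```

Threads are fine here despite the GIL, since each one spends its time waiting on an external process rather than running Python code.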

CiaranWelsh

1 Answer


Import `time` and use `time.sleep(60)`. If you put this in your for loop with a reasonable delay as the parameter, you can reduce the number of processes running at once while still having more than one running at a time.
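The suggestion above can be sketched like this (an illustrative sketch, not code from the answer; the command list and delay are placeholders):

```python
import subprocess
import time

def staggered_launch(commands, delay=60):
    """Launch every command, pausing `delay` seconds between launches so
    the processes come up staggered instead of all at once."""
    procs = []
    for cmd in commands:
        procs.append(subprocess.Popen(cmd))
        time.sleep(delay)
    return procs

# e.g. staggered_launch([['CopasiSE', f] for f in glob.glob('*.cps')], delay=60)
```

Note this only staggers the start times; if jobs outlive the delay, the number running still grows, so it lessens rather than bounds the load.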

Jonathan
  • Hi Jonathan, thanks for the response. This is a good solution and works well but for completeness do you happen to know if there is a way to restrict the number of files being run to (say) 8 at a time, until all jobs are finished? Cheers – CiaranWelsh Mar 04 '16 at 10:17
  • I can give you an idea, but I wouldn't know how to express it in code. You would have to use a loop or conditional statement: while commands.running >= 8: pause script – Jonathan Mar 04 '16 at 14:54
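The comment's idea, pausing while eight jobs are already running, can be sketched by polling a list of `Popen` handles (an illustrative sketch, not the commenter's code; the `limit` and poll interval are assumptions):

```python
import subprocess
import time

def run_limited(commands, limit=8, poll_interval=1.0):
    """Keep at most `limit` subprocesses alive; as each one exits,
    start the next pending command. Returns the finished Popen handles."""
    pending = list(commands)
    running = []
    done = []
    while pending or running:
        # move finished processes out of the running set
        still_running = []
        for p in running:
            (done if p.poll() is not None else still_running).append(p)
        running = still_running
        # top the running set back up to the limit
        while pending and len(running) < limit:
            running.append(subprocess.Popen(pending.pop(0)))
        if running:
            time.sleep(poll_interval)
    return done

# e.g. run_limited([['CopasiSE', f] for f in glob.glob('*.cps')], limit=8)
```

This guarantees a hard upper bound on concurrent processes, which is what the follow-up comment asked for.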