
I have the nested loops below. How can I parallelize the outer loop so that it is distributed across 4 simultaneous runs, waiting for all 4 runs to complete before moving on with the rest of the script?

    for r in range(4):
        for k in range( r*nAnalysis/4, (r+1)*nAnalysis/4 ):

            # - Write Abaqus INP file - #
            writeABQfile(ppos,props,totalTime[k],recInt[k],inpFiles[k],i,lineNum[k],aPath[k])

            # - Delete LCK file to Enable Another Analysis - #
            delFile(aPath[k]+"/"+inpFiles[k]+".lck")

            # - Run Analysis - #
            runABQfile(inpFiles[k],aPath[k])

I tried using multiprocessing.Pool, but it never gets into the function:

    def parRunABQfiles(nA,nP,r,ppos,prop0,prop1,totalTime2Run_,recIntervals_,inpFiles_,i,lineNumbers_,aPath_):
        from os import path
        from auxFunctions import writeABQfile, runABQfile
        print("I am Here")
        for k in range( r*nA/nP, (r+1)*nA/nP ):
            # - Write Abaqus INP file - #
            writeABQfile(ppos,prop0,prop1,totalTime2Run_,recIntervals_,inpFiles_,i,lineNumbers_,aPath_)
            # - Delete LCK file to Enable Another Analysis - #
            delFile(aPath_+"/"+inpFiles[k]+".lck")
            # - Run Analysis - #
            runABQfile(inpFiles_,aPath_)
            # - Make Sure Analysis is not Bypassed - #
            while os.path.isfile(aPath_+"/"+inpFiles[k]+".lck") == True:
                sleep(0.1)
        return k

    results = zip(*pool.map(parRunABQfiles, range(0, 4, 1)))

runABQfile is just a subprocess.call to a shell script that runs Abaqus:

    def runABQfile(inpFile,path):
        import subprocess
        import os

        prcStr1 = ('sbatch '+path+'/runJob.sh')

        process = subprocess.call(prcStr1, stdin=None, stdout=None, stderr=None, shell=True )

        return
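
Note that subprocess.call only blocks until sbatch itself exits, i.e. until the job is queued, not until the analysis finishes; that is why I poll for the .lck file above. A minimal illustration of the blocking/return behavior:

```python
import subprocess

# subprocess.call waits for the launched command to exit,
# then returns its exit code.
rc = subprocess.call("echo job queued", shell=True)
print(rc)  # 0 on success
```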

There are no errors showing up, so I am not sure why it is not getting in there. I know it isn't because writeABQfile never writes the input file. The question again is:

How can I parallelize the outer loop so that it is distributed across 4 simultaneous runs, waiting for all 4 runs to complete before moving on with the rest of the script?

David P.

1 Answer


Use the concurrent.futures module if multiprocessing is what you want.

from concurrent.futures import ProcessPoolExecutor

def each(r):
    for k in range( r*nAnalysis//4, (r+1)*nAnalysis//4 ):
        writeABQfile(ppos,props,totalTime[k],recInt[k],inpFiles[k],i,lineNum[k],aPath[k])
        delFile(aPath[k]+"/"+inpFiles[k]+".lck")
        runABQfile(inpFiles[k],aPath[k])

with ProcessPoolExecutor(max_workers=4) as executor:
    output = executor.map(each, range(4)) # returns an iterable

If you just want to "do" stuff rather than "produce" results, check out the as_completed function from the same module. There are direct examples in the docs.
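
A self-contained sketch of the as_completed pattern, with a placeholder `each` standing in for the per-chunk Abaqus work:

```python
from concurrent.futures import ProcessPoolExecutor, as_completed

def each(r):
    # placeholder for the per-chunk work (write INP, delete .lck, run Abaqus)
    return r * r

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=4) as executor:
        futures = [executor.submit(each, r) for r in range(4)]
        for fut in as_completed(futures):  # yields each future as it finishes
            print(fut.result())
```

Exiting the `with` block waits for all submitted work, so the rest of the script only runs after all 4 chunks complete.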

C Panda
  • @C Panda, Is `concurrent.futures` available for Python 2.7.10 ...it seems it is not. Unfortunately that is the version of python I can use in the cluster I am running. – David P. May 11 '16 at 18:53
  • @DavidP. using a custom process pool is messy. `concurrent.futures` does scary amount of things underneath to give you a threading like API. so, use a custom pool then. – C Panda May 11 '16 at 23:40
  • @C Panda, I am marking your answer as the solution. Thank you. However for my purposes, I combined it with the solution on [HERE](http://stackoverflow.com/questions/9874042/using-pythons-multiprocessing-module-to-execute-simultaneous-and-separate-seawa) – David P. May 13 '16 at 00:36
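
Following up on the Python 2.7 point in the comments: since concurrent.futures is not in the 2.7 standard library (it requires the `futures` backport), the same pattern with multiprocessing.Pool might look like the sketch below. `nAnalysis` is hard-coded here for illustration; the real worker body would do the write/delete/run steps from the question.

```python
from multiprocessing import Pool

def parRunABQfiles(r):
    # stand-in for the real chunk body (write INP, delete .lck, run Abaqus)
    nAnalysis = 8  # illustrative value; the real script defines this elsewhere
    return list(range(r * nAnalysis // 4, (r + 1) * nAnalysis // 4))

if __name__ == "__main__":
    pool = Pool(processes=4)
    # map blocks until all four chunks have finished
    results = pool.map(parRunABQfiles, range(4))
    pool.close()
    pool.join()
    print(results)  # [[0, 1], [2, 3], [4, 5], [6, 7]]
```

pool.map returns results in argument order and does not return until every worker is done, which satisfies the "wait for all 4 runs" requirement.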