
Hi, I have just started programming in Python and I am trying to use subprocess.Popen to run multiple instances of a program that I compile using "make". Before each "make", though, I have to do some text processing and generate a set of files that "make" will use. I would now like to run the same program with different generated files simultaneously and write the output of all the instances into the same file. Depending on the number of instances, I will also have to generate that many text files. In essence, I want to do all the operations below the first for loop simultaneously, say 'n' times. Any help would be greatly appreciated :).

for mC in range(monteCarlo):
    print "Simulation Number",str(mC+1),"of",str(monteCarlo)
    L = numpy.zeros((1,4),float)
    W = numpy.zeros((1,4),float)
    i = 0
    j = 0
    with open("1t.sp", "r") as inFile:
        with open("2t.sp","w") as outFile:
            line = inFile.readline()
            while (line != ""):
                newLine = []
                for words in line.split():
                    if words.startswith("W="):
                        W[0,i] = float(words[2:].replace('n',''))*random.normalvariate(1,widthDeviation)
                        #print i,words,str('W='+str(W[i]).strip('[]')+'n').replace(" ","")
                        words = str('W='+str(W[0,i]).strip('[]')+'n').replace(" ","")
                        i = i+1
                    elif words.startswith("L="):
                        L[0,j] = float(words[2:].replace('n',''))*random.normalvariate(1,lengthDeviation)
                        #print j,words,str('L='+str(L[j]).strip('[]')+'n').replace(" ","")
                        words = str('L='+str(L[0,j]).strip('[]')+'n').replace(" ","")
                        j = j+1
                    newLine.append(words)
                #print newLine
                outFile.write(" ".join(newLine))
                outFile.write("\n")
                line = inFile.readline()
    outFile.close()
    inFile.close()
    openWrite.write(str(W).strip('[]'))
    openWrite.write(str(L).strip('[]'))
    call(["make"])
    fRate = (open("tf.log","r").readlines()[34]).split()[-2]
    cSect = (open("tf.log","r").readlines()[35]).split()[-2]
    openWrite.write("\t")
    openWrite.write(fRate)
    openWrite.write(" ") 
    openWrite.write(cSect)
    openWrite.write("\n")
openWrite.close()   
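
As a rough sketch of the concurrent launch described at the top of the question, several "make" builds could be started at once with subprocess.Popen along these lines. The directory names run_1 ... run_4 and the count of 4 are placeholders, and each directory is assumed to already contain a Makefile and the generated input files for that instance:

import subprocess

# Placeholder directories; each is assumed to hold a Makefile and the
# generated files for one instance of the program.
dirs = ["run_%d" % (k + 1) for k in range(4)]

# Start every build without waiting for the previous one to finish.
procs = [subprocess.Popen(["make"], cwd=d) for d in dirs]

# Then block until all of the builds are done before reading their logs.
for p in procs:
    p.wait()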
shrikanth

1 Answer


If your system has multiple processors or cores, you can take advantage of that by using the multiprocessing module to run Python functions concurrently:

import multiprocessing as mp
from subprocess import call

def run_mc(mC):
    print "Simulation Number", str(mC+1), "of", str(monteCarlo)
    ...
    call(["make"])
    fRate = (open("tf.log", "r").readlines()[34]).split()[-2]
    cSect = (open("tf.log", "r").readlines()[35]).split()[-2]
    return fRate, cSect

def log_result(result):
    # This is called whenever run_mc returns a result.
    # result is modified only by the main process, not the pool workers.
    fRate, cSect = result
    with open(..., 'a') as openWrite:
        openWrite.write('\t{f} {c}\n'.format(f = fRate, c = cSect))

def main():
    # mp.Pool creates a pool of worker processes. By default it creates as many
    # workers as the system has processors. When the problem is CPU-bound, there
    # is no point in making more.
    pool = mp.Pool()
    for mC in range(monteCarlo):
        # This will call run_mc(mC) in a worker process.
        pool.apply_async(run_mc, args = (mC,), callback = log_result)
    # Close the pool and wait for all the workers (and their callbacks) to finish.
    pool.close()
    pool.join()

if __name__ == '__main__':
    main()
unutbu
  • Will this generate all the text files that are required for "make"? That is, how will make know which text file to take as input? Or am I missing something? – shrikanth Dec 21 '11 at 17:07
  • Oops, I had not paid attention to that. In `run_mc`, you could use `dirname=str(mC+1)` to make a new directory (`os.makedirs(dirname)`), change into it (`os.chdir(dirname)`) and do your work there (see the sketch after these comments). – unutbu Dec 21 '11 at 17:13
  • By the way, since you are using the `with`-block syntax, you do not need the explicit calls to `outFile.close()` and `inFile.close()`. The file handle is closed when Python exits the block. – unutbu Dec 21 '11 at 17:17
  • Thanks a lot for helping me out. Now the last question is: how do I write the values from the 'n' output logs into the same file? Do I just explicitly point them to the same file in the function definition itself, or is there a better method? – shrikanth Dec 21 '11 at 17:24
  • Also, I find this useful; maybe it is the closest to what I wanted: http://stackoverflow.com/questions/2359253/solving-embarassingly-parallel-problems-using-python-multiprocessing – shrikanth Dec 21 '11 at 17:35
  • I've edited my answer to show one way to retrieve the `fRate` and `cSect` from the separate processes and write them all out to a file in the main process. I'm assuming that the order does not matter. Results are appended to the file as each separate process finishes. – unutbu Dec 21 '11 at 17:57
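
Putting the directory-per-simulation suggestion from the comments together with the pool above, `run_mc` might look roughly like the sketch below. The directory name `str(mC+1)` comes from the comment; the step that generates the input files is left as a placeholder, and `cwd=workdir` is passed to `call` here instead of calling `os.chdir`, so the worker process's own working directory is left untouched:

import os
from subprocess import call

def run_mc(mC):
    # One directory per simulation, so the generated files, the make run and
    # tf.log of different instances cannot overwrite each other. The Makefile
    # is assumed to be present (or copied) in this directory as well.
    workdir = os.path.abspath(str(mC + 1))
    if not os.path.isdir(workdir):
        os.makedirs(workdir)
    # ... generate 2t.sp (and anything else make needs) inside workdir ...
    call(["make"], cwd=workdir)
    # Read the two result lines from this instance's own log file.
    lines = open(os.path.join(workdir, "tf.log"), "r").readlines()
    fRate = lines[34].split()[-2]
    cSect = lines[35].split()[-2]
    return fRate, cSect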