
I have a program that needs to create several graphs, each of which often takes hours to build. I therefore want to run them simultaneously on different cores, but I cannot seem to get the processes to run with the multiprocessing module. Here is my code:

import multiprocessing

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=full_graph)
        jobs.append(p)
        p.start()

(full_graph() is defined earlier in the program; it is simply a function that runs a collection of other functions.) Note: all output goes to .txt files. If it helps, here is the output of the code (with the actual file locations changed):

runfile('file location', wdir='file directory')

Despite producing this output, the program still writes nothing to the text file, and I have verified that the function being run works by itself. I have tried a few other methods, but this is the only one so far that has produced any output at all. I am using the Spyder IDE with WinPython 3.6.3.
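For reference, here is a complete, minimal sketch of the pattern that works when saved as a .py file and run directly (on Windows, multiprocessing spawns fresh interpreters, so the target must be defined at module level; running inside Spyder's console can hide or break worker output). full_graph here is a hypothetical stand-in for the real function, and the per-process filenames are invented for illustration:

```python
import multiprocessing

def full_graph(i):
    # Hypothetical stand-in for the real graph-building function;
    # each worker writes its result to its own file.
    with open("graph_%d.txt" % i, "w") as f:
        f.write("graph %d done\n" % i)

def run_all():
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=full_graph, args=(i,))
        jobs.append(p)
        p.start()
    # Block until every worker has finished, so the script does not
    # exit before the output files are written.
    for p in jobs:
        p.join()

if __name__ == '__main__':
    run_all()
```

Joining at the end is not strictly required for non-daemonic processes, but it makes the script wait for the workers, which helps when checking whether the files actually appeared.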

J.Barker
    how about... after p.start() add p.join() cf. https://stackoverflow.com/questions/25391025/what-exactly-is-python-multiprocessing-modules-join-method-doing#25391156 – user8241626 Jan 06 '18 at 22:00
    @user8241626 That shouldn't be needed. Joining is done implicitly here, since the processes are not daemonic. See the [docs](https://docs.python.org/3/library/multiprocessing.html?highlight=multiprocessing%20process#multiprocessing.Process.daemon). – bnaecker Jan 06 '18 at 22:33
  • How is the work split up across the processes? Are you sure they're not conflicting with one another in some way? – bnaecker Jan 06 '18 at 22:43
  • It is basically the same process being run multiple times. However, they all write to the same text file, could that cause a problem? – J.Barker Jan 07 '18 at 09:38
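Several processes appending to one file can indeed interleave or clobber each other's output. One way to serialize the writes is to pass a shared multiprocessing.Lock to each worker; the sketch below is illustrative only (write_result and the file path are made-up names, not from the original program):

```python
import multiprocessing

def write_result(lock, path):
    result = "one graph's worth of output\n"  # placeholder computation
    # The lock serializes access, so lines from different processes
    # cannot interleave inside the shared file.
    with lock:
        with open(path, "a") as f:
            f.write(result)

def main(path="combined_output.txt"):
    lock = multiprocessing.Lock()
    jobs = [multiprocessing.Process(target=write_result, args=(lock, path))
            for _ in range(5)]
    for p in jobs:
        p.start()
    for p in jobs:
        p.join()

if __name__ == '__main__':
    main()
```

An even simpler alternative is to give each process its own output file and merge them afterwards, which sidesteps the locking entirely.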

0 Answers