
I am just trying out multiprocessing in Python and I ran into a problem.

from multiprocessing import Process

w = 4
arr = []

def func(num):
    for i in range(num,50,w):
        arr.append(i)

if __name__ == '__main__':
    p1 = Process(target=func, args=(1,))
    p1.start()
    p2 = Process(target=func, args=(2,))
    p2.start()
    p1.join()
    p2.join()

After running the code, the arr list is still empty.

UPDATE: Anyone who just wants to solve a problem like this should use threading instead.

import threading

w = 4
arr = []

def func(num):
    for i in range(num,50,w):
        arr.append(i)

if __name__ == '__main__':
    jobs = []
    jobs.append(threading.Thread(target=func, args=(1,)))
    jobs.append(threading.Thread(target=func, args=(2,)))
    jobs.append(threading.Thread(target=func, args=(3,)))
    for j in jobs:
        j.start()
    for j in jobs:
        j.join()
  • Does this answer your question? [Modify object in python multiprocessing](https://stackoverflow.com/questions/15857838/modify-object-in-python-multiprocessing) – jkr Apr 20 '22 at 21:13

1 Answer

Multiprocessing uses child processes, rather than threads.

You pickled the empty list arr, shipped a copy of it to the first child process, and discarded that copy when the child exited. Then the same thing happened with the second child process. The parent's arr was never touched.

You will need some inter-process communication (IPC) to get results from the children back to the parent. But this seems like an XY question, and your true concern may be more complex than what you described. (We do appreciate that you went to the trouble of offering an MRE!)
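For example, one common form of IPC is a multiprocessing.Queue. The sketch below (an illustration, not the only way to do it) adapts the question's func so each child sends its results back to the parent over a queue:

```python
from multiprocessing import Process, Queue

w = 4

def func(num, q):
    # Each child computes its slice and sends the whole list back.
    q.put([i for i in range(num, 50, w)])

if __name__ == '__main__':
    q = Queue()
    procs = [Process(target=func, args=(n, q)) for n in (1, 2)]
    for p in procs:
        p.start()
    arr = []
    for _ in procs:
        arr.extend(q.get())  # collect one result list per child
    for p in procs:
        p.join()
    print(sorted(arr))
```

Note that the parent drains the queue before calling join(); joining a child that is still blocked on a full queue can deadlock.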

tl;dr: No, this approach won't work; sub-processes are not threads.


Spoiler alert: the simplest and most powerful way of using multiprocessing tends to look like this:

    for result in pool.imap(func, inputs):
        ...  # consume each result as it arrives
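Concretely, a Pool version of the question's task might look like the sketch below, with func changed to return its results instead of appending to a global:

```python
from multiprocessing import Pool

w = 4

def func(num):
    # Return the results; child-side mutation of a global would be lost.
    return [i for i in range(num, 50, w)]

if __name__ == '__main__':
    with Pool() as pool:
        arr = []
        for result in pool.imap(func, [1, 2]):
            arr.extend(result)
    print(sorted(arr))
```

Pool pickles each return value and ships it back to the parent for you, which is exactly the IPC the original Process-based code was missing.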
J_H
  • Your comment was very useful! In your opinion, what's the best way to run functions like this in parallel? – davidandsch Apr 20 '22 at 21:31
  • Multiprocessing can do many things. For _my_ use cases, I typically have lots of inputs to process, and would like to quickly grind through them using all cores. That's a good match for Pool. I turn a `for item in items:` processing loop into a pool iteration. Well, actually, I usually wrap it with tqdm() as well, to offer a hint about how patient I will need to be. – J_H Apr 23 '22 at 22:02