
I was building a program to be deployed on Windows, but I built it on Linux. I use multiprocessing, and everything worked fine on Linux. On Windows, however, the processes do not run simultaneously, and when they terminate, the program hangs in the join statements. I read this was a known bug fixed in 3.7, but I am on Python 3.9.4.

The code:

def worker(day, queue, team, daily_jobs):
    print(f"process {day} started")
    initial_solution = solve_for_routes(
        len(team), daily_jobs[day])
    print(f"process {day} ended")
    queue.put([day, initial_solution])

def main(pools, workday=6):
    queue = multiprocessing.SimpleQueue()
    team = ["bir", "iki", "üç"]
    processes = []

    for day in daily_jobs:
        if not daily_jobs[day].empty:
            processes.append(
                multiprocessing.Process(target=worker, args=(day, queue, team, daily_jobs)))

    for process in processes:
        process.start()

    for process in processes:
        process.join()

    for _ in processes:
        q = queue.get()
        plan[q[0]] = q[1]

    named_plan = convert_to_named(plan, workday, team)

    return named_plan

It prints:

process 0 started
process 0 ended
process 1 started
process 1 ended
process 2 started
process 2 ended
process 3 started
process 3 ended
process 4 started
process 4 ended
process 5 started
process 5 ended

And the program just hangs, never making it past the join().

Again, this works perfectly fine on Linux.

  • How do you execute your main()? That part of your code is missing. – Hannu Apr 15 '21 at 20:45
  • You don't even need the `join`s. All you care about is reading the results from the queue. After you've read N results, you know the processes are done. – Tim Roberts Apr 15 '21 at 20:49
  • @Hannu I just call it with the parameters. Nothing unusual. – astackoverflowuser Apr 15 '21 at 20:52
  • @Tim Roberts You're right but that's not the only issue. The processes execute in order rather than at the same time. – astackoverflowuser Apr 15 '21 at 20:52
  • How do you know? If the task is small and you don't have a plethora of processors, it's quite possible each finishes before the next starts. If you put `time.sleep(5)` in the function, do you see the same result? – Tim Roberts Apr 15 '21 at 20:55
  • When I run your code, they run out of order. I think it's working fine. – Tim Roberts Apr 15 '21 at 21:00
  • @Tim Roberts the processes take about 2 minutes each. I will try it without the joins anyway, just in case. So the multiprocessing should work out of the box in windows? – astackoverflowuser Apr 15 '21 at 21:00
  • Yes. I just ran your code on my Win 10 box, with stubs for your functions, and it worked fine. – Tim Roberts Apr 15 '21 at 21:04
  • @TimRoberts Well, removing the joins worked. Thanks a lot! – astackoverflowuser Apr 15 '21 at 21:46
  • you could also use concurrent.futures.ProcessPoolExecutor – DevLounge Apr 15 '21 at 23:31
  • Does this answer your question? [Python 3 Multiprocessing queue deadlock when calling join before the queue is empty](https://stackoverflow.com/questions/31665328/python-3-multiprocessing-queue-deadlock-when-calling-join-before-the-queue-is-em). **Calling `join` is *not* the problem; it's *when* you are calling `join` that is the problem.** – Booboo Apr 16 '21 at 15:25
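
As the last two comments point out, the resolution was to drain the queue before (or instead of) joining: a child process that has written a large result into a `SimpleQueue` can block until the parent reads it, so joining first can deadlock. A minimal sketch of the corrected ordering, with a hypothetical stub in place of the real `solve_for_routes` work:

```python
import multiprocessing

def worker(day, queue):
    # Stub for the real per-day computation (~2 minutes each in the question).
    queue.put([day, f"solution-{day}"])

def collect_plans(days):
    queue = multiprocessing.SimpleQueue()
    processes = [multiprocessing.Process(target=worker, args=(d, queue))
                 for d in days]
    for p in processes:
        p.start()

    # Drain the queue FIRST. Each child may block until its result is
    # consumed, so calling join() before reading can deadlock.
    plan = {}
    for _ in processes:
        day, solution = queue.get()
        plan[day] = solution

    # By now every child has flushed its result, so join() returns promptly.
    # (As noted in the comments, the joins are optional once all N results
    # have been read.)
    for p in processes:
        p.join()
    return plan

if __name__ == "__main__":
    print(collect_plans(range(3)))
```

`concurrent.futures.ProcessPoolExecutor`, suggested in another comment, sidesteps the queue entirely by returning results through futures.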

0 Answers