
I am using Python's ProcessPoolExecutor to run multiple tasks in parallel and handle each result as it finishes. As soon as at least one of them produces a satisfying answer, I want to exit the program.

However, this does not seem possible: even after calling `executor.shutdown(wait=False)`, I still have to wait for all active tasks in the pool to finish before my script can exit.

Is there a way to kill all the remaining active children and exit? Also, is there a better way to stop as soon as at least one child returns the answer we are waiting for?
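To illustrate, here is a minimal sketch of the setup described above (the `work` function, the inputs, and the target value are hypothetical stand-ins for the real jobs):

```python
# Sketch of the problem: break out of as_completed() once one result
# satisfies a condition, then try to shut the executor down without waiting.
from concurrent.futures import ProcessPoolExecutor, as_completed

def work(n):
    return n * n  # stand-in for the expensive task

def find_first(numbers, target):
    exc = ProcessPoolExecutor(max_workers=2)
    futures = [exc.submit(work, n) for n in numbers]
    found = None
    for fut in as_completed(futures):
        if fut.result() == target:
            found = fut.result()
            break
    # shutdown(wait=False) returns immediately, but tasks that are
    # already running keep going, so the interpreter still waits for
    # the worker processes before it can actually exit.
    exc.shutdown(wait=False)
    return found

if __name__ == "__main__":
    print(find_first(range(8), 9))
```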

user1635881

1 Answer


What you're doing is not quite clear, but executors strike me as the wrong tool entirely.

multiprocessing.Pool seems much more suitable and allows exactly what you're asking for: iterate on imap_unordered (or use apply_async and poll the results), and once you have what you were looking for, terminate() the pool and then join() it.
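A minimal sketch of that approach (`square`, the inputs, and the target are stand-ins for the actual jobs):

```python
# imap_unordered yields results in completion order; terminate() kills the
# remaining workers even mid-job, so we can stop at the first match.
from multiprocessing import Pool

def square(n):
    return n * n  # stand-in for the real work function

def first_match(numbers, target):
    with Pool(processes=4) as pool:
        for result in pool.imap_unordered(square, numbers):
            if result == target:
                pool.terminate()  # kill all remaining workers immediately
                pool.join()       # reap the worker processes
                return result
    return None

if __name__ == "__main__":
    print(first_match(range(100), 49))
```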

Masklinn
  • thanks! will `terminate()` also kill all submitted processes? – user1635881 Jun 03 '20 at 11:13
  • That is, in fact, the entire point of `terminate()`. The alternative is `close()`, which prevents new tasks from being submitted and instructs workers to exit as they finish their tasks, but `terminate()` stops the workers even if they're mid-job. – Masklinn Jun 03 '20 at 11:56
  • `from concurrent.futures import ProcessPoolExecutor, as_completed`; `exc = ProcessPoolExecutor(max_workers=4)`; `processes = [exc.submit(job, arg) for job, arg in jobs_tuple]`; `for job in as_completed(processes):`; `if job.result() == 'meets_a_condition': break`; `exc.shutdown(wait=False)` – NONONONONO Aug 06 '21 at 11:56