
With this example, I am able to start 10 processes and then continue to do "stuff".

"""Demonstration of GIL-friendly asynchronous development with Python's multiprocessing module."""

import random
import time
import multiprocessing


def process(instance):
    # Defined at module level so it can be pickled/imported by child
    # processes on spawn-based platforms as well.
    total_time = random.uniform(0, 5)
    time.sleep(total_time)
    print('Process %s : completed in %s sec' % (instance, total_time))
    return instance


if __name__ == '__main__':
    # Start 10 processes, then carry on doing other work in the parent.
    for i in range(10):
        multiprocessing.Process(target=process, args=(i,)).start()

    for i in range(2):
        print("im doing stuff")

output:

>> 
im doing stuff
im doing stuff
Process 8 : completed in 0.5390905372395016 sec
Process 6 : completed in 1.2313793332779521 sec
Process 2 : completed in 1.3439237625459899 sec
Process 0 : completed in 2.171809500083049 sec
Process 5 : completed in 2.6980031493633887 sec
Process 1 : completed in 3.3807358192422416 sec
Process 3 : completed in 4.597366303348297 sec
Process 7 : completed in 4.702447947943171 sec
Process 4 : completed in 4.8355495004170965 sec
Process 9 : completed in 4.9917788543156245 sec

I'd like to have a main `while True` loop that does data acquisition, starts a new process at each iteration (with the new data), and checks whether any process has finished so I can look at its output.

How can I verify that a process has ended, and how do I get its return value? Edit: while other processes in the list are still executing. To summarize my problem: how can I tell which processes in a list have finished, while some are still executing and new ones are being added?
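Something along the lines of the sketch below is what I have in mind. `acquire_data()` is only a hypothetical stand-in for my real acquisition code, and I'm assuming a `multiprocessing.Queue` is one way to get results back to the parent, since `Process` itself doesn't hand back the target's return value:

import queue
import random
import time
import multiprocessing


def acquire_data():
    # Hypothetical stand-in for the real data acquisition.
    return random.random()


def worker(data, result_queue):
    # Stand-in for the real per-sample computation; puts its result on the
    # queue because a Process does not return the target's return value.
    result_queue.put((data, data * 2))


if __name__ == '__main__':
    result_queue = multiprocessing.Queue()
    running = []

    while True:
        data = acquire_data()
        p = multiprocessing.Process(target=worker, args=(data, result_queue))
        p.start()
        running.append(p)

        # Check which processes have ended; is_alive() never blocks.
        still_running = []
        for p in running:
            if p.is_alive():
                still_running.append(p)
            else:
                p.join()  # reap the finished child
        running = still_running

        # Drain whatever results are already available, without blocking.
        try:
            while True:
                data, result = result_queue.get_nowait()
                print('finished:', data, '->', result)
        except queue.Empty:
            pass

        time.sleep(0.1)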

asked by mooder
  • Does this answer your question? [How can I recover the return value of a function passed to multiprocessing.Process?](https://stackoverflow.com/questions/10415028/how-can-i-recover-the-return-value-of-a-function-passed-to-multiprocessing-proce) – mkrieger1 Jun 02 '22 at 16:16
  • No, it uses the `.join()` method which blocks the execution until all processes are finished. I have processes that will be added while others are still executing. I want to see at the start of the loop if any in the list of processes are finished (then take them out of the list). – mooder Jun 02 '22 at 19:41
  • Creating a new process for each loop iteration is going to be quite slow, because just creating a process in the first place is quite an expensive operation. It would be better to use a processing pool `multiprocessing.Pool` where the worker processes live longer (for multiple "tasks") to reduce overhead – Aaron Jun 02 '22 at 20:29
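For reference, a minimal sketch of the pool-based variant Aaron suggests, assuming polling is acceptable: `apply_async` returns `AsyncResult` objects whose `ready()` method can be checked without blocking, and `get()` returns immediately once a result is ready. The `worker` function and the fixed number of iterations are placeholders for the real acquisition loop:

import multiprocessing
import time


def worker(data):
    # Placeholder computation; its return value travels back via AsyncResult.
    return data * 2


if __name__ == '__main__':
    with multiprocessing.Pool(processes=4) as pool:
        pending = []
        for step in range(20):          # stands in for the while True loop
            pending.append(pool.apply_async(worker, (step,)))

            # Harvest results that are ready; neither call blocks here.
            still_pending = []
            for res in pending:
                if res.ready():
                    print('finished ->', res.get())
                else:
                    still_pending.append(res)
            pending = still_pending

            time.sleep(0.1)

        # Acquisition is over; wait for whatever is still running.
        for res in pending:
            print('finished ->', res.get())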

0 Answers