
I wrote a simple script. Something like this:

import multiprocessing


my_list = []

def run_a_process():
    proc = multiprocessing.Process(target=worker)
    proc.start()
    my_list.append(proc)

def worker():
    # do something here
    ...

def close_done_processes():
    global my_list
    for idx, proc in enumerate(my_list):
        if not proc.is_alive():
            del my_list[idx]

def main():
    while True:
        if len(my_list) <= 10:
            for _ in range(10 - len(my_list)):
                run_a_process()

        if len(my_list):
            close_done_processes()

main()

I don't know about this exact example, but the real program works just fine, no problems.

But after some days it freezes, without any error or anything. The program is running and the interpreter is working on it, but there are no more logs and no more functionality; even Ctrl+C won't stop the program. I think the problem is with the `del my_list[idx]` part: maybe it's not removing the item from memory, so it never gets garbage collected, piles up, and causes a freeze from hitting a memory limit?

I want to know: how can I solve this issue?

I want to add items to the list, and remove from memory the ones that are already done processing, while keeping the other unprocessed items in the list, without getting this freeze.

    You're deleting from the list while iterating over it, which can cause problems. Here, it means that you may accidentally remove non-dead processes from the list and leave dead ones, although it's hard to determine exactly which is happening. See [here](https://stackoverflow.com/questions/6260089/strange-result-when-removing-item-from-a-list) for an explanation. I don't know if that's your problem here, but it certainly won't behave as you expect in all cases. – Carcigenicate Aug 24 '20 at 14:32

1 Answer


You've got a few problems here:

  1. As written, on Windows this code should fill your machine with a nigh-infinite number of processes in seconds (at least according to the documentation; you may be avoiding this by luck). On Windows, `multiprocessing` spawns a fresh interpreter that re-imports your script, and without a guard each re-import calls `main` again. You need the `if __name__ == '__main__':` guard around your invocation of `main` to prevent it:

    if __name__ == '__main__':
        main()
    
  2. Your code for cleaning up the list is broken: mutating a collection while you iterate over it is a bad idea, and it will delete the wrong elements of the list when there are two or more elements to remove, so some might never get deleted if there's always an element before them that's deleted first (see the demonstration just after this list).

  3. You're not actually joining the dead processes, which delays cleanup of Process resources for a potentially significant (indefinite?) period of time.
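
To see how #2 misbehaves, here's a minimal sketch with a plain list standing in for the process list; the strings are hypothetical stand-ins for dead and still-alive processes:

items = ['dead', 'dead', 'alive', 'dead']
for idx, item in enumerate(items):
    if item == 'dead':
        # Deleting shifts everything after idx one slot left, so the
        # loop never examines the element that moved into slot idx.
        del items[idx]
print(items)  # ['dead', 'alive'] -- a "dead" entry survives the sweep

In your code, that means a dead process can survive the sweep whenever another dead process sits right before it.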

To fix #2 and #3, the easiest solution is to just build a new list of alive processes to replace the existing list, and join the ones that aren't alive:

def close_done_processes():
    global my_list
    new_list = []
    for proc in my_list:
        if proc.is_alive():
            # Still running: keep it around for the next sweep.
            new_list.append(proc)
        else:
            # Finished: join it so its resources are released promptly.
            proc.join()
    my_list = new_list
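
As an alternative sketch that avoids the explicit bookkeeping: multiprocessing.active_children() returns the child processes that are still alive, and as a documented side effect it joins any children that have already finished, so the same cleanup could be written as:

def close_done_processes():
    global my_list
    # active_children() joins finished children as a side effect, so this
    # both reaps the dead processes and tells us which are still running.
    alive = multiprocessing.active_children()
    my_list = [proc for proc in my_list if proc in alive]

Either way, the key points are the same: don't mutate the list mid-iteration, and make sure finished processes actually get joined.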
ShadowRanger
  • Thanks man, I did not expect an answer so quickly. In the real program I do use the `__main__` part, but the thing about making a new list and joining the done processes might solve the issue; I never tried that. I can't be sure unless I test it for two weeks of running. – mohammad fallah.rasoulnejad Aug 24 '20 at 14:51