I wrote a simple script, something like this:
import multiprocessing

my_list = []

def run_a_process():
    # spawn a worker process and keep a handle to it
    proc = multiprocessing.Process(target=worker)
    proc.start()
    my_list.append(proc)

def worker():
    # do something here
    ...

def close_done_processes():
    # remove processes that have finished from the list
    global my_list
    for idx, proc in enumerate(my_list):
        if not proc.is_alive():
            del my_list[idx]

def main():
    while True:
        # top the pool back up to 10 running processes
        if len(my_list) <= 10:
            for _ in range(10 - len(my_list)):
                run_a_process()
        if len(my_list):
            close_done_processes()

main()
I don't know about this stripped-down example, but the real program works just fine at first, with no problems. After a few days, though, it freezes without any error or message: the interpreter is still running, but there are no more logs and no more activity, and even Ctrl+C won't stop it. I think the problem is with the del my_list[idx] part. I suspect it isn't actually removing the item from memory and letting it be garbage collected, so dead processes pile up until they eventually cause a freeze from hitting some memory limit?
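To check my guess about the list deletion, I tried a tiny standalone test. This is just my attempt to reproduce the pattern in isolation, not code from the real program:

# deleting from a list while enumerating it shifts the remaining
# items left, so enumerate skips the element right after each deletion
items = ["done", "done", "running", "done"]
for idx, item in enumerate(items):
    if item == "done":
        del items[idx]
print(items)  # prints ['done', 'running'] -- one "done" entry survives

So it looks like this kind of loop can leave finished processes in the list instead of removing all of them.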
I want to know how I can solve this issue. I want to keep adding items to the list and remove the ones that have already finished processing (so their memory is released), while keeping the still-running ones in the list, without getting this freeze.
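Something like the following is what I have in mind for the cleanup function: rebuild the list instead of deleting from it mid-iteration, and join() each finished process so it actually gets reaped. This is just my sketch of the idea, and I'm not sure it's the right fix:

import multiprocessing

my_list = []

def close_done_processes():
    # rebuild the list instead of deleting from it while iterating,
    # and join() each finished process so the OS can reclaim it
    global my_list
    still_running = []
    for proc in my_list:
        if proc.is_alive():
            still_running.append(proc)
        else:
            proc.join()  # returns immediately for a dead process
    my_list = still_running

Would something along these lines avoid the pile-up, or is the freeze coming from somewhere else?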