I'm doing my best to close and clean up Queues when I'm done using them, in order to collect output from a Process in Python's multiprocessing module. Here's some code which dies at some point with "too many open files". What more can I do to clean up completed jobs and queues so that I can run as many as I like?
# The following [fails to] demonstrate how to clean up jobs and
# queues (the queues are the key?) to avoid the OSError of too many
# open files.
from multiprocessing import Process, Queue, cpu_count

def dummy(inv, que):
    que.put(inv)
    return 0

nTest = 2800
queues = [None for ii in range(nTest)]
for ii in range(nTest):
    queues[ii] = Queue()
    job = Process(target=dummy, args=[ii, queues[ii]])
    job.start()
    print('Started job %d' % ii)
    job.join()
    print('Joined job %d' % ii)
    job.terminate()
    print('Terminated job %d' % ii)
    queues[ii].close()
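For context: the crash hits right around job 1009 (see the output below), which looks suspiciously close to the common soft limit of 1024 open descriptors, given that each Queue presumably keeps a pipe open as long as my queues list holds a reference to it. This separate snippet (not part of the failing script; that my limit is 1024 is an assumption about my system) shows how the limit can be checked:

import resource  # Unix-only, which matches the /usr/lib64 paths in my traceback

# Report the per-process open-file limits; the soft limit is
# commonly 1024, which would explain failure near job 1009.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print('open-file limit: soft=%d, hard=%d' % (soft, hard))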
Because the OSError is raised deep inside multiprocessing rather than on a specific line of my own code, the traceback doesn't help me isolate the problem. The report looks like this:
...
Terminated job 1006
Started job 1007
Joined job 1007
Terminated job 1007
Started job 1008
Joined job 1008
Terminated job 1008
Started job 1009
Joined job 1009
Terminated job 1009
---------------------------------------------------------------------------
OSError Traceback (most recent call last)
<ipython-input-2-5f057cd2fe88> in <module>()
----> 1 breaktest()
... in breaktest()
/usr/lib64/python2.6/multiprocessing/__init__.pyc in Queue(maxsize)
/usr/lib64/python2.6/multiprocessing/queues.pyc in __init__(self, maxsize)
/usr/lib64/python2.6/multiprocessing/synchronize.pyc in __init__(self)
/usr/lib64/python2.6/multiprocessing/synchronize.pyc in __init__(self, kind, value, maxvalue)
OSError: [Errno 24] Too many open files
> /usr/lib64/python2.6/multiprocessing/synchronize.py(49)__init__()
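One thing I'm unsure about is whether close() alone is enough. The multiprocessing docs also describe Queue.join_thread() as the companion to close(); here is a minimal sketch of the per-job cleanup I have in mind (I haven't verified whether this actually releases the pipe descriptors, which is essentially my question):

from multiprocessing import Process, Queue

def dummy(inv, que):
    que.put(inv)
    return 0

q = Queue()
job = Process(target=dummy, args=[0, q])
job.start()
print(q.get())   # drain the queue before joining the worker
job.join()
q.close()
q.join_thread()  # documented companion to close(); does this free the descriptors?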