I have a pretty simple setup like this:
while True:
    model.fit(mySeqGen(xx), ..., use_multiprocessing=True, workers=8)
    # <stuff>
    model.save_weights(fname)
    gc.collect()
This runs fine for a long time, but if left overnight it starts raising OSError: [Errno 24] Too many open files
on every loop iteration. The full stack trace is on another machine, but it contains multiple references to multiprocessing.
This is surely not caused by actual files I open; it looks like a byproduct of the worker processes/threads spawned under the hood not being cleaned up properly, so descriptors accumulate across iterations until the process hits its limit. Is there a simple way to make this loop stable over the long run, so it cleans up after itself on each pass?
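For reference, here is the helper I could use to confirm the descriptor leak between iterations. This is a sketch of my own, not part of the setup above: it counts the process's open file descriptors via /proc/self/fd, so it is Linux-specific.

```python
import os

def open_fd_count():
    # Count file descriptors currently open by this process.
    # /proc/self/fd holds one entry per open descriptor (Linux only).
    return len(os.listdir('/proc/self/fd'))

# Logging this once per loop pass would show whether the count
# grows after each model.fit(...) call:
#
# while True:
#     model.fit(mySeqGen(xx), ..., use_multiprocessing=True, workers=8)
#     print('open fds:', open_fd_count())
```

If the printed count climbs steadily each pass, that would confirm the leak is in descriptors held by leftover workers rather than in files I open myself.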