I am developing a long-running Python script that makes many connections to different serial ports. The script crashes a few hours into its execution with the error "Too many open files".
I have tracked the issue to the serial module: the .close() method does not seem to reduce the number of file descriptors Python is holding. I am checking this with lsof | grep python | wc -l. I am using Debian 7.2 and Python 2.7.3.
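For reference, the descriptor count can also be watched from inside the script on Linux by reading /proc/self/fd, which has one entry per descriptor the process currently holds:

import os

def count_open_fds():
    # /proc/self/fd contains one symlink per open file
    # descriptor of the current process (Linux only).
    return len(os.listdir("/proc/self/fd"))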
The example below slowly uses up more and more file descriptors until it hits the limit. Why is this happening, and how can I avoid it?
#!/usr/bin/env python
import serial  # used to communicate with pressure controller
import logging
import time
from time import gmtime, strftime

logging.basicConfig(filename="open_files_test.log")

# Write unusual + significant events to logfile + stdout
def log(message):
    timestamp = strftime("%Y-%m-%d %H:%M:%S", gmtime())
    logging.warning(timestamp + " " + message)
    print(message)

for i in range(2000):
    for n in range(1, 12):
        port_name = "/dev/tty" + str(n + 20)
        try:
            # Positional args: baudrate, bytesize, parity, stopbits,
            # timeout, xonxoff, rtscts, writeTimeout, dsrdtr, interCharTimeout
            com = serial.Serial(port_name, 9600, serial.EIGHTBITS,
                                serial.PARITY_NONE, serial.STOPBITS_ONE,
                                0.0, False, False, 5.0, False, None)
            com.open()
            com.flushInput()
            com.flushOutput()
            log("Opened port: " + port_name)
        except serial.SerialException:
            log("Could not open serial port: " + port_name)
            continue  # nothing to close if the open failed
        com.close()
        log("Closed port: " + port_name)
    time.sleep(1)

log("Finished Program")
Thanks