
I have searched here extensively without much luck on this specific question.

Currently, my program has issues with file descriptor exhaustion (too many open files), and nearly all of the descriptors point to my single log file. My program is set out in this fashion:

In my main:

    # Initializes program log handler
    log = Log()
    log.setup_custom_logger('root')

    logger = logging.getLogger('root')

Within the Log class:

    def setup_custom_logger(self, name):
        """ Sets base log handler for the program and log entry formatting """

        # Create logger
        logger = logging.getLogger(name)

        log_level = self.getLogLevel()
        logger.setLevel(log_level)

        # Sets formatting, output file and handler
        handler = logging.FileHandler(
                os.getenv("HOME") + config.LOG_DIRECTORY + 'qtgtool.log')
        handler.setFormatter(self.getFormat(log_level))
        # Add handler to logger
        logger.addHandler(handler)

And any other class in the program calls this in its __init__:

    logger = logging.getLogger('root')

I have checked, and there is only ever one FileHandler object, and it is shared by all classes. As such, I have no idea why creating class objects would open so many file descriptors on my log... Or am I missing something? Is it simply a case of too many class objects, all with logging capabilities?
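For reference, the number of handlers actually attached to a logger can be inspected at runtime; if the count grows as objects are created, addHandler is being called repeatedly. A small diagnostic sketch (the helper name is mine, not part of the program):

```python
import logging

def count_file_handlers(name='root'):
    """Return how many FileHandler instances are attached to a logger.

    Each attached FileHandler normally holds one open descriptor on
    its log file, so this number should stay at 1 for a single log.
    """
    logger = logging.getLogger(name)
    return sum(isinstance(h, logging.FileHandler)
               for h in logger.handlers)
```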

The output from the traceback:

    IOError: [Errno 24] Too many open files: /path/to/file/being/read.txt

This corresponds with lsof -p indicating 1024 open files, nearly all of them my log file. As a side note, I have seen the options for increasing the open file limit. I do not want to do this, as it would miss the point of fixing the underlying problem.

Further debugging, using this class in place of logging.FileHandler:

    class FHandler(logging.FileHandler):

        def __init__(self, filename, encoding=None, delay=0):
            logging.FileHandler.__init__(self, filename, 'w', encoding, delay)

        def _open(self):
            # The stream must be returned, otherwise the FileHandler
            # is left with self.stream set to None and logging fails.
            stream = logging.FileHandler._open(self)
            print 'Opened file'
            return stream

This prints a single 'Opened file' to the console, whilst lsof (already, in the early stages of the program) shows over 100 references to the log file.

Many thanks and apologies for the deficiencies in my programming syntax.

dpshi1
  • What is the exact error message (traceback) that you get? – DonCristobal Jan 11 '17 at 08:46
  • @DonCristobal I have added it to the question. The rest of the traceback is just three lines relating to parent .py processes that have called on each other sequentially prior to the final process failing. – dpshi1 Jan 11 '17 at 12:00
  • I tried the following: setup a logger as per your code in my main module, get logger with logging.getLogger, import another module with a class whose init tries to access the same logger using logging.getLogger('root'). lsof -p did not show the logger file more than once, i.e. the file was only opened once. Could there be something else going on in your code, maybe you instantiate Log() from somewhere else? (Plus, just to clarify, I assume the '/path/to/file/being/read.txt' refers to the qtgtool.log file?) – DonCristobal Jan 11 '17 at 15:27
  • @DonCristobal It's happened on multiple occasions, but frustratingly I have noticed today that every so often it does not happen and only one instance is shown. The open file is just another file I was attempting to open which then put the program over the limit. However, the log file takes up 99% of open file descriptors. I've checked through and there are no other instances of setup_custom_logger. One other of Log(), but __init__ is passed. I am running a Linux Ubuntu GNOME system if that makes any difference. – dpshi1 Jan 11 '17 at 16:22
  • Maybe put a breakpoint on FileHandler._open() in logging/__init__.py, and see when it gets there? That is where the log file is opened for writing (in 'a' mode). (You do not delete the log file during processing, do you? Maybe the file handle has to be reopened when the log file disappears (for whatever reason), or something like that.) – DonCristobal Jan 11 '17 at 16:33
  • 1
    @DonCristobal Added the results above. I presume that's what you were after? Sorry if not, I'm relatively new to python... I make only one other reference to the log and that is for my log window where I use QProcess with 'self.process.start('tail -n ' + lines + ' -F ' + self.matches[0])'. This hasn't been initialized by this stage, however. – dpshi1 Jan 12 '17 at 12:26
  • Well my idea was more to (with the debugger) set up an actual breakpoint in the source code of the logging module and observe when the file open routine gets hit there. (If you don't know how to do this, here is some info for PyCharm: https://www.youtube.com/watch?v=QJtWxm12Eo0 ) But what you have done is a good idea too. I'll think about what you said and write later. – DonCristobal Jan 12 '17 at 12:48
  • @DonCristobal Thanks, let me know. I'm starting to think that perhaps it's some sort of system environment issue/difference when starting the program than perhaps the program itself... But I would say that given I developed half of it :D – dpshi1 Jan 12 '17 at 13:35
  • Hmmmm... just reading the -f and -F options for tail...: −f The −f option causes tail to not stop when end of file is reached, but rather to wait for additional data to be appended to the input. The −f option is ignored if the standard input is a pipe, but not if it is a FIFO. −F The −F option implies the −f option, but tail will also check to see if the file being followed has been renamed or rotated. The file is closed and reopened when tail detects that the filename being read from has a new inode number. The −F option is ignored if reading from standard input rather than a file. – DonCristobal Jan 12 '17 at 13:40
  • Could it be that every call to tail reopens a file (and does not close it again)...? – DonCristobal Jan 12 '17 at 13:41
  • @DonCristobal Interesting. I'll isolate the log window from the program and assess if it continues to occur. – dpshi1 Jan 12 '17 at 14:08
  • @DonCristobal No such luck I'm afraid. Same thing. What I have just noticed, which may be useful information, is that if I log off the server, log back on over normal ssh and re-run the program, it will run cleanly, at least for a time. Then at some point it will change, and every time I run the program thereafter it will run 'dirtily' with large numbers of the file descriptors... – dpshi1 Jan 12 '17 at 14:16
  • Just to be clear, you are completely removing the self.process.start statement, and still 100s of files are opened? – DonCristobal Jan 12 '17 at 15:37
  • @DonCristobal I commented out the whole class (class initiation reference) relating to the log window, which held that code within it. There is no other reference to the class. – dpshi1 Jan 12 '17 at 16:00

1 Answer


Please try the following code to close the file:

    logger.removeHandler(handler)

"handler" is the created "FileHandler" instance.

From here.
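Beyond removing the handler when it is no longer needed, a common cause of this symptom is a setup routine being called more than once, attaching a fresh FileHandler (and opening a fresh descriptor) on each call. A minimal sketch of an idempotent setup, with illustrative names that are not taken from the question's program:

```python
import logging

def setup_logger_once(name, logfile):
    """Attach a FileHandler to the named logger exactly once.

    logging.getLogger returns the same logger object for the same
    name, so its handlers list persists across calls; repeated calls
    therefore reuse the existing handler instead of opening a new
    file descriptor each time.
    """
    logger = logging.getLogger(name)
    if not any(isinstance(h, logging.FileHandler) for h in logger.handlers):
        handler = logging.FileHandler(logfile)
        handler.setFormatter(
            logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
        logger.addHandler(handler)
    return logger
```

Calling this from every class that needs the logger is then safe: the descriptor count stays at one no matter how many objects are created.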

gangmax