
I have a Python module that performs some logging during some of the methods it contains:

module.py

import os
import time

LOG_FILE = "/var/log/module.log"

def log(message):
    # Create the file with 0664 permissions if needed, append, then close.
    with os.fdopen(os.open(LOG_FILE, os.O_RDWR | os.O_CREAT, 0o664), "a+") as f:
        f.write("[%s] %s\n" % (time.strftime("%c"), message))

def do_something():
    log("Doing something")
    # ...

In this implementation the log file is opened and closed every time log() is called.

I'm considering refactoring it so the file is opened once when the module is loaded, but I'm not sure how to ensure it is closed when a script importing the module ends. Is there a clean way to do this?

Edit: I'm not asking about closing the file when an exception is encountered, but when the script that imports my module exits.
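One common pattern for this (a best-effort sketch, since nothing runs if the process is killed abruptly, as the answers below explain) is to open the file once at import time and register a close handler with `atexit`. The temp-file path here is only for illustration; the real module would keep its `/var/log/module.log` constant:

```python
import atexit
import os
import tempfile
import time

# Stand-in path for illustration; the real module would use
# LOG_FILE = "/var/log/module.log" as in the question.
LOG_FILE = os.path.join(tempfile.gettempdir(), "module.log")

# Opened once when the module is imported.
_log_file = open(LOG_FILE, "a")

def _close_log():
    # atexit handlers run on normal interpreter exit, but NOT if the
    # process receives SIGKILL or calls os._exit().
    _log_file.close()

atexit.register(_close_log)

def log(message):
    _log_file.write("[%s] %s\n" % (time.strftime("%c"), message))
    _log_file.flush()  # hand the data to the OS promptly
```

Any script that imports the module then calls `log()` as before, and the file is closed when that script exits normally.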

DanielGibbs
  • possible duplicate of [Do files get closed during an exception exit?](http://stackoverflow.com/questions/17577137/do-files-get-closed-during-an-exception-exit) – jhoepken Jun 26 '15 at 12:40
  • you don't have guarantees. However, if you are looking for a "best effort" behaviour, check `atexit` – MariusSiuram Jun 26 '15 at 12:42
  • Why not use `apache http server`,`mod_wsgi`,`logger` and let them handle the writes to log files. Faster, efficient.. or just use a `logging` [module](https://docs.python.org/2.3/lib/node304.html) – Vaulstein Jun 26 '15 at 12:48
  • @Vaulstein Because it's not running as a CGI script. – DanielGibbs Jun 26 '15 at 13:00

3 Answers


The OS takes care of open file descriptors when a process dies. This may lead to data loss if file buffers inside the application are not flushed. You could add f.flush() in the log() function after each write (note: this does not guarantee that the data is physically written to disk, so it may still be lost on a power failure; see Threadsafe and fault-tolerant file writes).
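To make that loss window explicit, here is a hedged sketch of a log() variant that flushes and fsyncs after each write. The two-argument signature is mine, not the question's, and assumes the file object is kept open for the life of the process:

```python
import os
import time

def log(f, message):
    # f is a file object kept open for the life of the process.
    f.write("[%s] %s\n" % (time.strftime("%c"), message))
    f.flush()             # drain Python's user-space buffer into the OS
    os.fsync(f.fileno())  # ask the kernel to commit it to disk; even this
                          # is not an absolute guarantee on every setup
```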

Python may also close (and flush) the file on exit during garbage collection, but you shouldn't rely on it.

atexit works only during a normal exit (and exit on some signals). It won't help if the script is killed abruptly.

As @René Fleschenberg suggested, use the logging module, which calls .flush() after each record and registers an atexit handler to shut down its handlers for you.

jfs

Python is usually pretty good at cleaning up after itself. If you must do something when the script ends, look at the atexit module, but even then it offers no guarantees.

You may also want to consider logging to either stdout or stderr, depending on purpose, which avoids keeping a file around altogether:

import sys
import time

def log(message):
    sys.stderr.write("[%s] %s\n" % (time.strftime("%c"), message))
RoadieRich

Python will automatically close the opened files for you when the script that has imported your module exits.

But really, just use Python's logging module.

René Fleschenberg
  • So there's no need to explicitly close any files? I'll have a look at `logging` but at the moment it's quicker and easier to do it manually rather than messing around setting up paths, permissions, and formats. – DanielGibbs Jun 26 '15 at 13:00
  • Whether files are automatically closed at the end depends on the Python implementation, as far as I know. It is not good practice to rely on it. – mkrieger1 Jun 26 '15 at 13:03
  • It's good practice to explicitly close files, but if you only want to close on program exit, you can let Python do that for you. – René Fleschenberg Jun 26 '15 at 13:04
  • I recommend looking at the logging module *now*. It may seem like your current method is quicker and easier, but it is actually harder and more complicated, which is why you are asking this question ;) – René Fleschenberg Jun 26 '15 at 13:07