
I have the following logging module (logger.py):

import logging, logging.handlers
import config

log = logging.getLogger('myLog')

def start():
    "Function sets up the logging environment."
    log.setLevel(logging.DEBUG)
    formatter = logging.Formatter(fmt='%(asctime)s [%(levelname)s] %(message)s', datefmt='%d-%m-%y %H:%M:%S')

    if config.logfile_enable:
        filehandler = logging.handlers.RotatingFileHandler(config.logfile_name, maxBytes=config.logfile_maxsize, backupCount=config.logfile_backupCount)
        filehandler.setLevel(logging.DEBUG)
        filehandler.setFormatter(formatter)
        log.addHandler(filehandler)

    console = logging.StreamHandler()
    console.setLevel(logging.DEBUG)
    console.setFormatter(logging.Formatter('[%(levelname)s] %(message)s')) # nicer format for console
    log.addHandler(console)

    # Levels are: debug, info, warning, error, critical.
    log.debug("Started logging to %s [maxBytes: %d, backupCount: %d]" % (config.logfile_name, config.logfile_maxsize, config.logfile_backupCount))

def stop():
    "Function closes and cleans up the logging environment."
    logging.shutdown()

For logging, I call logger.start() once, and then use from logger import log in any project file that needs it. From there I just use log.debug() and log.error() as needed. It works fine from everywhere in the script (different classes, functions and files), but it doesn't work in processes launched through the multiprocessing module.
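Roughly, the setup looks like this (the worker function is just a hypothetical stand-in for my real code):

import multiprocessing
import logger
from logger import log

def worker():
    # On platforms where multiprocessing spawns a fresh interpreter (e.g. Windows),
    # the child re-imports logger but never runs logger.start(), so the 'myLog'
    # logger in the child has no handlers attached.
    log.debug("hello from the worker process")

if __name__ == '__main__':
    logger.start()
    log.debug("hello from the parent process")  # this one works
    p = multiprocessing.Process(target=worker)
    p.start()
    p.join()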

I get the following error: No handlers could be found for logger "myLog".

What can I do?

iTayb
  • Where do you initialize the "second" process, and does the second process execute the snippet of code above? – DonCallisto May 19 '12 at 12:19
  • Well, I didn't initialize it again in the second process, which explains why the handlers couldn't be found. But I can't initialize a different logger object on the same file and expect both objects to log correctly, or am I wrong? – iTayb May 19 '12 at 12:25
  • Maybe this is relevant? http://docs.python.org/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes – Not_a_Golfer May 19 '12 at 12:49
  • @Not_a_Golfer Thank you. Please post it as an answer so I can accept it. – iTayb May 19 '12 at 12:59

1 Answer


From the Python docs: logging to a single file from multiple processes is not supported, because there is no standard way to serialize access to a single file across multiple processes in Python.

See: http://docs.python.org/howto/logging-cookbook.html#logging-to-a-single-file-from-multiple-processes
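The workaround the cookbook suggests is to let only one process touch the log file and have every other process ship its records to it over a queue. A rough sketch of that idea using the standard library's QueueHandler/QueueListener (available since Python 3.2; the worker function and file name here are just placeholders):

import logging
import logging.handlers
import multiprocessing

def worker(queue):
    # Each child process gets only a QueueHandler; records are shipped to the
    # parent, which is the single process that actually writes the file.
    log = logging.getLogger('myLog')
    log.setLevel(logging.DEBUG)
    log.addHandler(logging.handlers.QueueHandler(queue))
    log.debug("hello from a worker process")

if __name__ == '__main__':
    queue = multiprocessing.Queue()

    filehandler = logging.FileHandler('app.log')
    filehandler.setFormatter(logging.Formatter('%(asctime)s [%(levelname)s] %(message)s'))

    # The listener runs in the parent and writes everything to one file.
    listener = logging.handlers.QueueListener(queue, filehandler)
    listener.start()

    procs = [multiprocessing.Process(target=worker, args=(queue,)) for _ in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()

With this layout the workers never open the log file themselves; the QueueListener in the parent is the only writer, which sidesteps the serialization problem the docs describe.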

BTW: what I do in this situation is use Scribe, a distributed logging aggregator that I log to via TCP. That lets me send logs from all of my servers to the same place, not just from all processes.

See this project: http://pypi.python.org/pypi/ScribeHandler
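If pulling in Scribe is overkill, the same idea (every process or machine sends records over TCP to one receiver that does the writing) can be sketched with just the standard library's SocketHandler; 'loghost' below is a placeholder for wherever the receiving server runs:

import logging
import logging.handlers

log = logging.getLogger('myLog')
log.setLevel(logging.DEBUG)

# Ships pickled LogRecords over TCP to a central receiver, which is then the
# only process that writes the actual log file. 'loghost' is a placeholder;
# DEFAULT_TCP_LOGGING_PORT is 9020.
socket_handler = logging.handlers.SocketHandler('loghost', logging.handlers.DEFAULT_TCP_LOGGING_PORT)
log.addHandler(socket_handler)

log.debug("hello from %s", __name__)

The receiving side needs a small socket server that unpickles the records and hands them to an ordinary file handler; the logging cookbook linked above has a complete working example of such a server.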

Not_a_Golfer