I've been struggling with multiprocessing logging for some time, for many reasons.
One of my reasons is: why another get_logger?
Of course I've seen this question, and it seems the logger that multiprocessing.get_logger returns does some "process-shared locks" magic to make log handling smooth.
So, today I looked into Python 2.7's multiprocessing code (/multiprocessing/util.py) and found that this logger is just a plain logging.Logger, with barely any magic around it.
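You can check this at runtime, too. This little sanity check (my own, not from the docs) confirms that what comes back is an ordinary Logger, created under the name 'multiprocessing':

```python
import logging
import multiprocessing

# get_logger essentially calls logging.getLogger('multiprocessing')
# under the hood (plus some level/atexit setup) -- the object you get
# back is a plain logging.Logger, not a special subclass
logger = multiprocessing.get_logger()
print(type(logger))   # <class 'logging.Logger'>
print(logger.name)    # multiprocessing
```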
Here's the description in the Python documentation, right above the get_logger function:
Some support for logging is available. Note, however, that the logging package does not use process shared locks so it is possible (depending on the handler type) for messages from different processes to get mixed up.
So when you use the wrong logging handler, can even the get_logger logger go wrong? I've been using a program that relies on get_logger for its logging for some time. It prints logs through a StreamHandler, and they (seemingly) never get mixed up.
Now my theory is:
- multiprocessing.get_logger doesn't do process-shared locks at all
- StreamHandler works for multiprocessing, but FileHandler doesn't
- the major purpose of this get_logger logger is to track processes' life cycles, and to provide an easy-to-get, ready-to-use logger that already logs the process's name/id and so on
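For reference, here's a minimal sketch of how I've been using it (via log_to_stderr, which just attaches a StreamHandler to the same logger; the level and process count are my own choices):

```python
import logging
import multiprocessing

def worker():
    # get_logger in the child returns the same named singleton logger
    logger = multiprocessing.get_logger()
    logger.info("hello from %s", multiprocessing.current_process().name)

if __name__ == "__main__":
    # log_to_stderr() attaches a StreamHandler (with a format that
    # includes the process name) to the logger from get_logger(),
    # and returns that same logger
    logger = multiprocessing.log_to_stderr()
    logger.setLevel(logging.INFO)

    procs = [multiprocessing.Process(target=worker) for _ in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

Each record ends up on stderr as one line; since StreamHandler emits a whole formatted record per write, lines from different processes rarely interleave in practice, which would explain my experience above.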
Here are my questions:
Is my theory right?
How/Why/When do you use this get_logger?