Can I run into problems when both processes want to write to the console (stderr, via the StreamHandler) at the same time in the following code (the FileHandler and StreamHandler are created globally)? Or will it be negligible (in terms of the chance of screen/output distortion)?
This is the start of the main/primary process (MyFormatter is a custom formatter with 6-digit precision):
logging.basicConfig(filename='primary.log', level=logging.DEBUG,  # logger acts on >= DEBUG
                    format='[%(asctime)s.%(msecs)03d, %(levelname)s, %(threadName)-10s, %(module)s, %(lineno)d] %(message)s',
                    datefmt='%Y-%m-%d %H:%M:%S')
console = logging.StreamHandler()  # stderr, as stream is omitted
console.setLevel(logging.DEBUG)
formatter = MyFormatter(fmt='[%(asctime)s, %(levelname)s, %(threadName)-10s, %(module)s, %(lineno)d] %(message)s',
                        datefmt='%Y-%m-%d %H:%M:%S.%f')
console.setFormatter(formatter)
logging.getLogger('').addHandler(console)

# fix the formatter of the FileHandler that was created in the basicConfig call
file_handlers = [h for h in logging.root.handlers if isinstance(h, logging.FileHandler)]
for h in file_handlers:
    h.setFormatter(formatter)
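MyFormatter itself isn't shown in the question; for context, a minimal sketch of what such a 6-digit-precision formatter might look like (my assumption, not the asker's actual class): time.strftime, which logging.Formatter.formatTime uses, does not support %f, so the usual approach is to override formatTime with datetime:

```python
import datetime
import logging

class MyFormatter(logging.Formatter):
    """Hypothetical sketch: time.strftime lacks %f, so use datetime for microseconds."""
    def formatTime(self, record, datefmt=None):
        ct = datetime.datetime.fromtimestamp(record.created)
        if datefmt:
            return ct.strftime(datefmt)  # %f expands to 6-digit microseconds
        return ct.isoformat(sep=' ')
```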
This run function gets spawned as a new (secondary) process, in which a FileHandler is created locally. The StreamHandler is still available globally due to process duplication/cloning (a copy of the parent's address space), which is what makes multiprocessing possible, I assume:
def run(self):
    fh = logging.FileHandler('secondary.log')
    fmt = MyFormatter(fmt='[%(asctime)s, %(levelname)s, %(processName)s, %(threadName)-10s, %(module)s, %(lineno)d] %(message)s',
                      datefmt='%Y-%m-%d %H:%M:%S.%f')
    fh.setFormatter(fmt)
    local_logger = logging.getLogger('secondary.log')
    local_logger.addHandler(fh)
    local_logger.warning(multiprocessing.current_process().name + ' SQL worker process started')
...
The post Python logging: different logging destination per process mentions:
"Suspect that your local logger is passing its log messages upwards to the top level, where it gets output to stdout." (1)
This suggests that both processes may end up using the same stdout/stderr (and are thus prone to race conditions without locking)?
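The upward-passing that (1) describes is logger propagation: a record logged on a named logger is also handed to the handlers of every ancestor, up to root. A small self-contained demo of that mechanism, using a StringIO stand-in for stderr (the names are illustrative, not from the question):

```python
import io
import logging

# Attach a handler to the root logger (stand-in for the global console StreamHandler)
buf = io.StringIO()
handler = logging.StreamHandler(buf)
logging.getLogger('').addHandler(handler)

# A named logger with a handler-free setup: its records propagate to root's handlers
local = logging.getLogger('secondary.log')
local.warning('worker started')

logging.getLogger('').removeHandler(handler)
assert 'worker started' in buf.getvalue()  # the record reached the root handler
```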
As both processes use different filenames (and Logger objects/names), I assume there are no problems with regard to FileHandlers/file writes and race conditions (that would require locking).
Also, in multiple modules logging.getLogger(__name__) is called. Will I run into problems with respect to logging if the secondary process makes calls from these modules? For now, I think the secondary process has its own Logger object/name, as obtained by logging.getLogger('secondary.log'), so no issues there. However, I can imagine that if I call methods of these modules from the secondary process, it may interfere with the Logger objects (named after these modules) associated with the main/primary process (for example, via propagation to ancestor Logger objects, like root, which has the same filename associated with it). Or will the process duplication prevent this?
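The logger hierarchy is determined purely by periods in the name, which can be verified directly (a small demo; the names are illustrative):

```python
import logging

leaf = logging.getLogger('secondary.log')
# 'secondary' has not been instantiated as a Logger yet,
# so the effective parent of 'secondary.log' is the root logger
assert leaf.parent is logging.getLogger('')

mid = logging.getLogger('secondary')  # creating the intermediate logger re-links the chain
assert leaf.parent is mid
assert mid.parent is logging.getLogger('')
```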
Also, both processes have only a MainThread (each process runs a single thread), so no multi-threading is involved.
I read something about it here (but it is still not clear to me), for example:
Logging separate files for different Processes in Python
Python using basicConfig method to log to console and file
Edit: On reflection, the same issue may be possible with file outputs. When the FileHandler - including its associated filename - is cloned into the secondary process's address space (it was set up via basicConfig), would it be possible that, via propagation (to ancestors) to the root logger in the second process (see (1) or the Python docs), a write to the same file as in the first process takes place (since basicConfig was called with that filename and the handler was cloned)? While https://docs.python.org/3/library/logging.html suggests a period-separated hierarchical value (mirroring the Python package structure), like foo.bar.baz, (1) suggests that any name is a descendant of root (''). That's why the solution may work even though the topic starter did not use the hierarchical format like foo.bar.baz.
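If that cross-file write via propagation is a concern, one common guard (my suggestion, not something from the question) is to set propagate = False on the child's logger, so its records never reach the root logger's cloned FileHandler. A sketch using StringIO stand-ins for the two log files:

```python
import io
import logging

root = logging.getLogger('')
primary = io.StringIO()            # stand-in for primary.log inherited from the parent
root_handler = logging.StreamHandler(primary)
root.addHandler(root_handler)

secondary = io.StringIO()          # stand-in for secondary.log
local = logging.getLogger('child') # illustrative name
local.addHandler(logging.StreamHandler(secondary))

local.warning('first')             # local handler + propagation to root's handler
local.propagate = False
local.warning('second')            # local handler only; root's "file" is untouched

root.removeHandler(root_handler)
assert 'first' in primary.getvalue() and 'second' not in primary.getvalue()
assert 'first' in secondary.getvalue() and 'second' in secondary.getvalue()
```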