
How would you dynamically change the file where logs are written to in Python, using the standard logging package?

I have a single-process, multi-threaded application that processes tasks for specific logical bins. To simplify debugging and searching the logs, I want each bin to have its own separate log file. Due to memory usage and scaling concerns, I don't want to split the process into multiple processes, whose output I could otherwise easily redirect to separate logs. However, by default, Python's logging package only outputs to a single location, either stdout/stderr or some other single file.

My question is similar to this question, except I'm not trying to change the logging level, just the logging output destination.

Cerin

1 Answer


You will need to create a different logger for each thread and configure each logger with its own file.

You can call something like this function in each thread, with the appropriate bin_name:

import logging

def create_logger(bin_name, level=logging.INFO):
    # Write this bin's records to its own file, e.g. 'my_bin.log'.
    handler = logging.FileHandler(f'{bin_name}.log')
    handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))

    bin_logger = logging.getLogger(bin_name)
    bin_logger.setLevel(level)
    bin_logger.addHandler(handler)

    return bin_logger
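As a usage sketch (the bin names `bin_a`/`bin_b` are made up for illustration; note that `logging.getLogger()` returns the same logger object for a given name, so this variant guards against attaching duplicate handlers on repeated calls):

```python
import logging
import threading

def create_logger(bin_name, level=logging.INFO):
    # getLogger() caches loggers by name, so only attach the
    # FileHandler the first time this bin's logger is requested.
    bin_logger = logging.getLogger(bin_name)
    if not bin_logger.handlers:
        handler = logging.FileHandler(f'{bin_name}.log')
        handler.setFormatter(
            logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
        bin_logger.setLevel(level)
        bin_logger.addHandler(handler)
    return bin_logger

def worker(bin_name):
    log = create_logger(bin_name)
    log.info('processing tasks for %s', bin_name)

threads = [threading.Thread(target=worker, args=(name,))
           for name in ('bin_a', 'bin_b')]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

After this runs, records for each bin land only in that bin's file (`bin_a.log`, `bin_b.log`).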
Lior Cohen
  • I'm not sure this is practical. I want to redirect all logging. That includes logs created by packages whose code I don't manage, and therefore can't modify to use my own `create_logger()`. – Cerin Mar 08 '21 at 15:09
  • It is more complicated but doable. You need to create a Handler with a Filter per thread and add all the handlers to the central logger. All logging will be broadcast to every handler, but each Filter only passes the records with the correct thread id. Each handler writes to a different file. Look at this answer: https://stackoverflow.com/a/55035193/3700626 – Lior Cohen Mar 12 '21 at 23:46
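The filter-per-thread approach from that comment can be sketched as follows. This is an illustrative variant that filters on `record.threadName` rather than the thread id, since a thread's name can be assigned before the thread starts; the logger name `some.package` stands in for any third-party package's logger, and the bin names are made up:

```python
import logging
import threading

class ThreadNameFilter(logging.Filter):
    """Pass only records emitted by the thread with the given name."""
    def __init__(self, thread_name):
        super().__init__()
        self.thread_name = thread_name

    def filter(self, record):
        # LogRecord.threadName is set from the emitting thread.
        return record.threadName == self.thread_name

root = logging.getLogger()
root.setLevel(logging.INFO)

# One FileHandler per bin, attached to the ROOT logger so that all
# logging, including loggers owned by third-party packages, is captured.
bins = ('bin_a', 'bin_b')
handlers = []
for bin_name in bins:
    handler = logging.FileHandler(f'{bin_name}.log')
    handler.setFormatter(
        logging.Formatter('%(asctime)s %(levelname)s %(name)s %(message)s'))
    handler.addFilter(ThreadNameFilter(bin_name))
    root.addHandler(handler)
    handlers.append(handler)

def worker(bin_name):
    # Even a logger we don't control propagates to the root logger,
    # where the filters route the record to the right file.
    logging.getLogger('some.package').info('only in %s.log', bin_name)

# Name each thread after its bin so the filters can match on it.
threads = [threading.Thread(target=worker, args=(b,), name=b) for b in bins]
for t in threads:
    t.start()
for t in threads:
    t.join()

for h in handlers:
    root.removeHandler(h)
    h.close()
```

Every record is offered to every handler, but each `ThreadNameFilter` rejects records from other threads, so each file receives only its own bin's output.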