I want to save my Python program's output in a log file and still be able to see it in the console.

My program runs every 30 minutes, and after 30 minutes its process is killed by a BATCH file that force-closes it.

Therefore, I can't use a solution that shows my program's output and saves it to a log file at the end of the program, because there is no 'ending': the program is killed in the middle of its runtime by the BATCH file.

Is there any solution that shows me the program output and writes it LIVE to the log file, so that even when I kill the process (and the program), I get all the output up to the point it was killed in the log file?

NatiFo

1 Answer


I also wanted to have logging in the console while running my Python script, but also wanted to have it in a log file for later or parallel reference. In addition, I wanted to be able to set different levels of output for the console and the file: for instance, in the file I wanted all levels (DEBUG, INFO, WARNING, ERROR, CRITICAL), but in the terminal I only wanted WARNING, ERROR, and CRITICAL. I also wanted to be able to choose whether to append to the log file or overwrite it.

  1. Create a file and call it logging_handler.py, for instance, with the following content:
import logging
from pathlib import Path


def logger(
    stream_logger=dict(),
    file_logger=dict(),
    logger_name="",
) -> logging.Logger:
    """
    Set up and return a logger with the specified formatting and handlers.

    If you work with a package which implements the Python's standard logging module you
    can provide the namespace of that package's logging. (e.g., "sqlalchemy.engine",
    "sqlalchemy.pool" etc.)
    If you create several loggers, the "mode" settings might interfere with each other.

    Parameters
    ----------
    stream_logger: dict. The valid keys and values are:
        - active: bool
        - level: str. The logging level for the stream logger. Must be one of: "DEBUG",
            "INFO", "WARNING", "ERROR", "CRITICAL".

    file_logger: dict. The valid keys are:
        - active: bool
        - level: str. The logging level for the file logger. Must be one of: "DEBUG",
            "INFO", "WARNING", "ERROR", "CRITICAL".
        - fname: str. The filename of the log file (without extension). Will be stored
            in logs/{fname}.log
        - mode: str. The file mode to use when opening the log file. Must be one of:
            "a" (append) or "w" (overwrite).

    logger_name: str, optional, default ""
        String that will be used as the name of the logger instance.

    Returns
    -------
    logging.Logger
        The logger object.
    """
    formatter = logging.Formatter(
        "%(asctime)s - %(name)s - %(levelname)s - Line#: %(lineno)d in Function: %(funcName)s - Message: %(message)s",
        "%d.%m.%Y %H:%M:%S",
    )
    logger = logging.getLogger(logger_name)
    logger.setLevel(
        "DEBUG"
    )  # Leave this set to DEBUG, otherwise it will cut off the logging level of the handlers

    valid_levels = ("DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL")
    valid_modes = ("a", "w")

    if stream_logger["active"] and stream_logger["level"].upper() in valid_levels:
        stream_handler = logging.StreamHandler()
        stream_handler.setLevel(stream_logger["level"].upper())
        stream_handler.setFormatter(formatter)
        logger.addHandler(stream_handler)
    elif not stream_logger["active"]:
        pass
    else:
        print("No valid logging level for stream")

    if (
        file_logger["active"]
        and file_logger["level"].upper() in valid_levels
        and file_logger["mode"].lower() in valid_modes
    ):
        Path("logs").mkdir(parents=True, exist_ok=True)
        file_handler = logging.FileHandler(
            filename=f"logs/{file_logger['fname']}.log",
            mode=file_logger["mode"].lower(),
        )
        file_handler.setLevel(file_logger["level"].upper())
        file_handler.setFormatter(formatter)
        logger.addHandler(file_handler)
    elif not file_logger["active"]:
        pass
    else:
        print("No valid logging level or file mode.")
    return logger


  2. In your main file, import logging_handler, access the logger, and pass it some parameters:
import logging_handler

logging = logging_handler.logger(
    stream_logger={"active": True, "level": "debug"},
    file_logger={
        "active": True,
        "level": "debug",
        "fname": "output",
        "mode": "a",
    },
    logger_name="smooLogger",
)

See the description for the parameters in logging_handler.py.
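The per-handler level filtering described at the top can also be seen in isolation with a minimal, self-contained sketch. It uses in-memory StringIO streams in place of the real console and log file, so nothing here depends on logging_handler.py; the logger name "level_demo" is just an illustration:

```python
import io
import logging

# Sketch of per-handler filtering: the "file" (an in-memory buffer)
# receives everything, the "console" buffer only WARNING and above.
logger = logging.getLogger("level_demo")
logger.setLevel(logging.DEBUG)  # let the handlers do the filtering

file_buf, console_buf = io.StringIO(), io.StringIO()

file_handler = logging.StreamHandler(file_buf)
file_handler.setLevel(logging.DEBUG)

console_handler = logging.StreamHandler(console_buf)
console_handler.setLevel(logging.WARNING)

for handler in (file_handler, console_handler):
    handler.setFormatter(logging.Formatter("%(levelname)s - %(message)s"))
    logger.addHandler(handler)

logger.debug("only in the file")
logger.warning("in both")

print(file_buf.getvalue())     # contains the DEBUG and the WARNING line
print(console_buf.getvalue())  # contains only the WARNING line
```

This is the same mechanism the answer's code relies on: the logger itself stays at DEBUG, and each handler cuts off the records below its own level.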

Now you can put logging statements into your code like:

logging.debug("Debug message")
logging.info("Info message")
logging.warning("Warning message")
logging.error("Error message")
logging.critical("Critical message")

The result will look like:

13.04.2023 12:34:25 - DEBUG - Line#: 36 in Function: __init__ - Message: DB connection established and cursor created.
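Regarding the question's concern about the process being force-killed mid-run: a FileHandler writes and flushes each record as it is emitted, so the log file is current at all times rather than only at a clean shutdown. A minimal self-contained check (using a plain FileHandler and a hypothetical flush_demo.log file, independent of logging_handler.py):

```python
import logging

# FileHandler flushes on every emit, so each record reaches the file
# immediately, not only when the program exits cleanly.
logger = logging.getLogger("flush_demo")
logger.setLevel(logging.DEBUG)
handler = logging.FileHandler("flush_demo.log", mode="w")
handler.setFormatter(logging.Formatter("%(levelname)s - %(message)s"))
logger.addHandler(handler)

logger.info("written before any clean shutdown")

# Read the file back *without* closing the handler: the record is
# already on disk because the handler flushed after emitting it.
with open("flush_demo.log") as f:
    content = f.read()
print("record on disk:", "written before any clean shutdown" in content)
```

Because each record is flushed as it is written, everything logged up to the moment the BATCH file kills the process should already be in the file.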

Hope that helps!

smoochy
  • Tried to run it and got an error: FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\User\\Project1\\build\\logs\\output.log' – NatiFo Apr 19 '23 at 22:30
  • Sorry, I only tested it on a Mac, where it worked. But I can confirm that it was not working on Windows when the folder was missing. I updated the code for `logging_handler.py`. Please copy the whole code again. – smoochy Apr 19 '23 at 23:14