Python Logging - Check Location of Log Files

Python logging - check location of log files?

The logging module uses handlers attached to loggers to decide how, where, or even whether messages ultimately get stored or displayed. You should really read the docs, but in short: if you call logging.basicConfig(filename=log_file_name), where log_file_name is the name of the file you want messages written to (note that you have to do this before anything else in logging is called), then all messages logged to all loggers will be written there, unless some further reconfiguration happens later. Be aware of the level the logger is set to, though: INFO is below the root logger's default level of WARNING, so you also have to pass level=logging.INFO to basicConfig for your message to end up in the file.
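For instance, a minimal sketch of that setup (the filename app.log is just an example, not anything from your code):

import logging

# Call basicConfig before any other logging calls so the root logger
# gets a FileHandler and the INFO level.
logging.basicConfig(filename="app.log", level=logging.INFO)

logging.info("This ends up in app.log")          # at or above INFO: written
logging.debug("This is dropped (DEBUG < INFO)")  # below the level: filtered out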

As to the other part of your question, logging.getLogger(some_string) returns a Logger object named some_string, inserted into the correct position in the hierarchy below the root logger. Called with no arguments, it returns the root logger. __name__ is the name of the current module, so logging.getLogger(__name__) returns a Logger object whose name is the current module's name. This is a common pattern with logging, as it makes the logger hierarchy mirror your code's module structure, which often makes log messages much more useful when debugging.
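For example, in a module hypothetically located at mypackage/db.py (the name is only for illustration), the pattern looks like this:

import logging

# __name__ is "mypackage.db" when this file lives at mypackage/db.py, so the
# logger below is a child of "mypackage", which is a child of the root logger.
logger = logging.getLogger(__name__)

def connect():
    logger.info("connecting")  # records are tagged with the module's name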

How to get log file path from logging after getLogger()

You can get the filename (if any) of the log file by inspecting the handlers of the root logger and checking their baseFilename attribute.

tl;dr

For example, update your example from this

import logging

logger = logging.getLogger( __name__ )
logger.info( "Logging Config Imported in Second Script" )

to this

import logging

logger = logging.getLogger( __name__ )
logger.info( "Logging Config Imported in Second Script" )

if logger.root.hasHandlers():
    logfile_path = logger.root.handlers[0].baseFilename
    logger.info( "Logging to File " + str(logfile_path) )

Solution

Actually, the logger you get back from getLogger() can be writing to zero, one, or many log files.

You can check whether any handlers (and therefore possibly log files) are in play by calling hasHandlers() on the root logger, which is accessible as logger.root:

logger.root.hasHandlers()

hasHandlers() returns True if at least one handler is attached to the logger (or to any of its ancestors).

From there, you can inspect the handlers via the root logger's logger.root.handlers list.

In the example above, we just grab the first handler (the 0th item in the list).

logger.root.handlers[0]

I didn't find it documented anywhere, but if you check the logging module's FileHandler class, you can see it has an instance variable named baseFilename holding the path to the handler's file:

class FileHandler(StreamHandler):
    """
    A handler class which writes formatted logging records to disk files.
    """
    def __init__(self, filename, mode='a', encoding=None, delay=False, errors=None):
        """
        Open the specified file and use it as the stream for logging.
        """
        ...
        self.baseFilename = os.path.abspath(filename)

(source)

So just read the baseFilename attribute off the first handler, and you get the absolute path to the log file:

logfile_path = logger.root.handlers[0].baseFilename
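Note that handlers[0] is not guaranteed to be a FileHandler (it could be a StreamHandler writing to the console, for example). A slightly more defensive sketch filters by type and collects every file path it finds:

import logging

# Collect the paths of all FileHandlers attached to the root logger.
logfile_paths = [
    handler.baseFilename
    for handler in logging.getLogger().handlers  # same list as logger.root.handlers
    if isinstance(handler, logging.FileHandler)
]

print(logfile_paths)  # e.g. ['/abs/path/to/app.log'], or [] if no file handlers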

Python, choose logging files' directory

Simply give a full path as the filename, e.g. filename=r"C:\User\Matias\Desktop\myLogFile.log".
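A sketch of pointing the log file at a directory of your choice, creating it if it doesn't exist yet (the directory and file names here are just examples):

import logging
import os

# Build the target directory and make sure it exists before configuring logging.
log_dir = os.path.join(os.path.expanduser("~"), "Desktop", "logs")
os.makedirs(log_dir, exist_ok=True)

logging.basicConfig(filename=os.path.join(log_dir, "myLogFile.log"),
                    level=logging.INFO)
logging.info("written to the chosen directory")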

How to get the file the Python logging module is currently logging to?

logging.config.fileConfig('some.log') is going to try to read logging configuration from some.log, not write log messages to it.

I don't believe there is a general way to retrieve the destination file -- it isn't always guaranteed to even be going to a file. (It may go to syslog, over the network, etc.)

Logging to two different .log files

Handlers added to the root logger via logging.basicConfig remain attached even if you call basicConfig again to reconfigure it.

In Python 3.8, a new force argument was introduced in logging.basicConfig. If this keyword argument is specified as true, any existing handlers attached to the root logger are removed and closed before carrying out the configuration specified by the other arguments.

That would make your function:

import logging

def example_request_handler(name):
    logging.basicConfig(
        level=logging.INFO,
        force=True,
        format="%(asctime)s [%(levelname)s] %(message)s",
        handlers=[
            logging.FileHandler(f"{name}.log"),
            logging.StreamHandler()])
    logging.info(name)

example_request_handler("first_request1")
example_request_handler("second_request2")

The above logs second_request2 only to the second file (without force=True, the second basicConfig call would be ignored and second_request2 would land in first_request1.log instead).

Please note that removing and re-adding handlers for every new log message is generally a bad idea. If you want to log into n different files, consider initializing a separate logger for each and keeping a mapping of loggers in a manager function that your individual logging functions can use, as sketched below.
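A minimal sketch of that manager idea; the get_file_logger helper and the cache dictionary are hypothetical names, not part of the logging module:

import logging

_loggers = {}  # cache: one logger (and one FileHandler) per name

def get_file_logger(name):
    # Returns a dedicated logger that writes to "<name>.log",
    # creating and caching it on first use.
    if name not in _loggers:
        logger = logging.getLogger(name)
        logger.setLevel(logging.INFO)
        logger.propagate = False  # keep messages out of the root logger's handlers
        handler = logging.FileHandler(f"{name}.log")
        handler.setFormatter(
            logging.Formatter("%(asctime)s [%(levelname)s] %(message)s"))
        logger.addHandler(handler)
        _loggers[name] = logger
    return _loggers[name]

get_file_logger("first_request1").info("goes to first_request1.log")
get_file_logger("second_request2").info("goes to second_request2.log")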


