Python Logging Module Gaps and Matches

I have a number of requirements for logging. This is how I’ve matched them up to the logging module.

  • Process-related messages are emailed to a system manager email address.

    I’ve decided that I want to distinguish messages by level. So for this case any CRITICAL message needs to go to the system manager address. This means that I need an email handler with a filter that selects messages with a level higher than some limit. A normal default filter will do this, using setLevel() on the relevant handlers.
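    As a minimal sketch of that idea: an SMTPHandler with setLevel(CRITICAL) does the selection without any custom filter. The mail host, addresses, and logger name below are placeholders, not details from the original setup.

```python
import logging
import logging.handlers

logger = logging.getLogger("process")

# Mail host and addresses are illustrative placeholders.
mail_handler = logging.handlers.SMTPHandler(
    mailhost="localhost",
    fromaddr="loader@example.com",
    toaddrs=["sysmanager@example.com"],
    subject="Process problem",
)
# The normal default level test does the filtering:
# only CRITICAL (and above) reaches the system manager.
mail_handler.setLevel(logging.CRITICAL)
logger.addHandler(mail_handler)
```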

  • File-related messages are emailed to a data manager address.

    Using the level concept, I need to identify messages with a level less than some limit. This is not a default test, so I need a custom filter that I can associate with another email handler.

    I could, of course, have used two different named loggers to make this distinction, and that might have been more in the spirit of the logging module. I haven’t got a particular reason for using message levels, but it seems to be doing the job so I’m not going to change it yet.
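    The custom "below some limit" filter is only a few lines. In this sketch the class and logger names are my own, and a simple list-collecting handler stands in for the data manager email handler so the behaviour is easy to see:

```python
import logging

class MaxLevelFilter(logging.Filter):
    """Accept only records strictly below a given level."""
    def __init__(self, max_level):
        super().__init__()
        self.max_level = max_level

    def filter(self, record):
        return record.levelno < self.max_level

class CollectingHandler(logging.Handler):
    """Stand-in for the data manager email handler."""
    def __init__(self):
        super().__init__()
        self.records = []
    def emit(self, record):
        self.records.append(record)

logger = logging.getLogger("file_messages")
logger.setLevel(logging.DEBUG)
logger.propagate = False

handler = CollectingHandler()
handler.addFilter(MaxLevelFilter(logging.CRITICAL))  # below CRITICAL only
logger.addHandler(handler)

logger.error("file problem")       # passes the filter
logger.critical("system problem")  # rejected by the filter
```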

  • Messages for a single file are grouped together.

    This is easily achieved with a buffering handler. Messages are accumulated until the process finishes with the file, and then they are flushed out. The problem is that the logging module does not have any method to flush on demand other than closing the logging sub-system. I wanted to keep the sub-system open and act as required when processing a file starts and stops. To do this I need a way of getting the logger instance to scan through its handlers and flush if appropriate. Doing it this way keeps the logging sub-system initialisation in one place and allows me to use something like getLogger().flush() in the code.
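    One way to get that getLogger().flush() behaviour, assuming the buffering is done with the standard MemoryHandler: subclass Logger with a flush() that scans its own handlers, and install it with setLoggerClass(). The class and logger names here are mine, and a list-collecting handler stands in for the real delivery targets.

```python
import logging
import logging.handlers

class CollectingHandler(logging.Handler):
    """Stand-in for the real target (email, database, ...)."""
    def __init__(self):
        super().__init__()
        self.records = []
    def emit(self, record):
        self.records.append(record)

class FlushableLogger(logging.Logger):
    """Logger whose flush() scans its handlers and flushes the buffering ones."""
    def flush(self):
        for handler in self.handlers:
            if isinstance(handler, logging.handlers.MemoryHandler):
                handler.flush()

logging.setLoggerClass(FlushableLogger)
logger = logging.getLogger("file_run")
logger.setLevel(logging.DEBUG)
logger.propagate = False

target = CollectingHandler()
# Buffer up to 200 records; a CRITICAL flushLevel stops premature auto-flushing.
buffer = logging.handlers.MemoryHandler(
    200, flushLevel=logging.CRITICAL, target=target)
logger.addHandler(buffer)

logger.info("line 1 of the file was odd")
logger.info("line 7 of the file was odd")
# target.records is still empty here: nothing has been delivered yet.
logger.flush()  # end of file: push the whole group at once
```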

  • The email subject line is set dynamically to reflect the current file.

    Setting metadata such as the subject line for an email happens when a handler is instantiated. The logging module allows handlers to be added and removed dynamically, so the default way of setting a dynamic subject line would be to add a suitable handler when processing starts, and remove it later. Doing this would mean that the code would need to know something about how the logger instance was constructed. The alternative is to have a logger method that updates data as appropriate. I can then use getLogger().do_something(with_this_text) and have the logger instance sort it all out for me.
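    A sketch of that alternative: a Logger subclass with a set_subject() method (my name, not from the post) that finds any SMTPHandler it owns and rewrites its subject attribute. The mail host, addresses, and file name are placeholders.

```python
import logging
import logging.handlers

class FileAwareLogger(logging.Logger):
    """Logger with a helper that updates handler metadata in place."""
    def set_subject(self, text):
        for handler in self.handlers:
            if isinstance(handler, logging.handlers.SMTPHandler):
                handler.subject = text

logging.setLoggerClass(FileAwareLogger)
logger = logging.getLogger("file_mail")

# Host and addresses are illustrative placeholders.
mail_handler = logging.handlers.SMTPHandler(
    "localhost", "loader@example.com", ["datamanager@example.com"],
    "No file yet")
logger.addHandler(mail_handler)

# When processing of a new file starts:
logger.set_subject("Problems in customers_2024.csv")
```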

  • All messages go to a common log file for basic record keeping.

    This is easy – I just attach a stream handler to the root logger and make sure that all messages by default go to a named logger. This second-level named logger is set up when the program starts and I provide a set of wrapper functions that log messages to the defined name.
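    A minimal sketch of that wiring, assuming a logger name and wrapper names of my own choosing; basicConfig() attaches the root stream handler (a FileHandler would give the common log file instead):

```python
import logging

# Root handler: everything propagates here for the common record.
logging.basicConfig(level=logging.DEBUG,
                    format="%(levelname)s %(name)s %(message)s")

# Second-level named logger, set up once at program start.
# The name is an assumption, not from the original post.
LOG_NAME = "lokai.loader"

def log_error(msg, *args):
    """Wrapper so the rest of the code never touches logger objects."""
    logging.getLogger(LOG_NAME).error(msg, *args)

def log_info(msg, *args):
    logging.getLogger(LOG_NAME).info(msg, *args)
```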

  • Messages are also posted to the Lokai database as actions that must be addressed.

    This needs a special handler that is added to the buffering handler along with the email handler. That way, a flush of the buffering handler sends the whole message set to both email and database. The database handler needs to be written, of course, and it does whatever is needed to create activity nodes in the Lokai database.
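    A sketch of the shape of such a handler, with a placeholder standing in for the real Lokai call. One wrinkle: MemoryHandler takes a single target, so one way to feed both email and database from the same flush is a small fan-out handler; that wiring is my assumption, not a detail from the post.

```python
import logging
import logging.handlers

CREATED = []  # stands in for rows in the Lokai database

def create_activity_node(text, level):
    """Placeholder for the real Lokai call that creates an activity node."""
    CREATED.append((level, text))

class ActivityNodeHandler(logging.Handler):
    """Turn each log record into an activity node."""
    def emit(self, record):
        try:
            create_activity_node(self.format(record), record.levelname)
        except Exception:
            self.handleError(record)

class FanOutHandler(logging.Handler):
    """MemoryHandler takes one target; this forwards to several."""
    def __init__(self, *targets):
        super().__init__()
        self.targets = targets
    def emit(self, record):
        for target in self.targets:
            target.handle(record)

logger = logging.getLogger("db_demo")
logger.setLevel(logging.DEBUG)
logger.propagate = False
buffer = logging.handlers.MemoryHandler(
    200, flushLevel=logging.CRITICAL,
    target=FanOutHandler(ActivityNodeHandler()))  # the email handler would sit here too
logger.addHandler(buffer)

logger.warning("column 3 looks wrong")
buffer.flush()  # end of file: the whole message set goes to every target
```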

  • Message statistics are provided for managing end of file actions.

    At the end of processing a file the program needs to know whether to commit results or rollback. Processing a file may result in a mixture of error and warning messages, so we need to know what happened. By default the logging module does not keep records of what it has seen, and a buffering handler does not provide access to its buffer for analysis. I decided to provide a modified Logger class that keeps a count of messages at each message level. The logging module allows me to set this new class as the default when instantiating loggers.
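    A sketch of such a counting Logger class, installed with setLoggerClass(); the class name and the commit test are illustrative, not taken from the Lokai code:

```python
import logging

class CountingLogger(logging.Logger):
    """Logger that counts how many records it saw at each level."""
    def __init__(self, name, level=logging.NOTSET):
        super().__init__(name, level)
        self.counts = {}

    def handle(self, record):
        self.counts[record.levelname] = self.counts.get(record.levelname, 0) + 1
        super().handle(record)

logging.setLoggerClass(CountingLogger)
logger = logging.getLogger("file_stats")
logger.setLevel(logging.DEBUG)
logger.propagate = False
logger.addHandler(logging.NullHandler())

logger.warning("odd date format")
logger.warning("odd date format again")
logger.error("row rejected")

# End of file: commit only if nothing worse than a warning happened.
commit = (logger.counts.get("ERROR", 0) == 0
          and logger.counts.get("CRITICAL", 0) == 0)
```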

The next few instalments will cover some of the details, although, naturally, the real detail is in the Lokai code.
