Making the Python logger module do your bidding

One of the things that Lokai supports is the automatic download and processing of data. For this type of operation logging is clearly important, both to inform people of problems to solve, and to keep records. In Lokai, the logging module has been brought in to service these various needs.

The basic automation concept is that files are placed in a directory, and a process comes along and loops through the files until none are left. For logging, we need to recognise that there are two groups of people involved.

One group, the data managers, are interested in the files themselves: was the file processed, were there any errors in the content of the file, are there any actions arising from the content. As far as possible, this group would perhaps prefer that all errors or actions for a single file are reported in one pass, rather than having the process stop on the first sign of trouble. Moreover, the reported errors should be grouped into a single message to reduce email traffic.

The second group, the system managers, are interested in the correct operation of the process. In this case, the process might stop if there is a system or programming issue.

For both groups, the logs must identify the actual file being processed.
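As a taste of how the standard logging module can separate these two audiences, here is a minimal sketch that routes records to two handlers by severity. The logger name, level thresholds and messages are illustrative only, and StringIO streams stand in for the email handlers a real setup would use:

```python
import io
import logging

# Hypothetical logger name; StringIO streams stand in for email handlers.
logger = logging.getLogger("lokai.file_processor")
logger.setLevel(logging.DEBUG)

# Data-manager handler: warnings and errors about the file content,
# but not process-level failures (filtered out below CRITICAL).
data_stream = io.StringIO()
data_handler = logging.StreamHandler(data_stream)
data_handler.setLevel(logging.WARNING)
data_handler.addFilter(lambda record: record.levelno < logging.CRITICAL)

# System-manager handler: process-level failures only.
system_stream = io.StringIO()
system_handler = logging.StreamHandler(system_stream)
system_handler.setLevel(logging.CRITICAL)

logger.addHandler(data_handler)
logger.addHandler(system_handler)

logger.warning("line 42: missing quantity field")  # data managers only
logger.critical("database connection lost")        # system managers only
```

Swapping each StreamHandler for a logging.handlers.SMTPHandler pointed at the appropriate address gives the two email streams described above.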

To cover the various needs of these groups Lokai provides a number of more or less configurable facilities:

  • Process related messages are emailed to a system manager email address.
  • File related messages are emailed to a data manager address.
  • Messages for a single file are grouped together.
  • The email subject line is set dynamically to reflect the current file.
  • All messages go to a common log file for basic record keeping.
  • Messages from file processing are accumulated and emitted as a single message for the whole file.
  • Messages are also posted to the Lokai database as actions that must be addressed.
  • Message statistics are provided for managing end of file actions.
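The accumulate-and-emit behaviour in the list above maps fairly naturally onto logging.handlers.MemoryHandler. The sketch below is one hedged way to do it, not Lokai's actual implementation: a high flushLevel stops the buffer flushing on individual errors, and we flush explicitly once the whole file has been processed. The StringIO target stands in for an SMTPHandler that would send one email per file.

```python
import io
import logging
import logging.handlers

# StringIO target stands in for the single-email SMTPHandler.
target_stream = io.StringIO()
target = logging.StreamHandler(target_stream)

# flushLevel above CRITICAL means no record triggers an early flush;
# the buffer only empties when we call flush() at end of file.
buffer = logging.handlers.MemoryHandler(
    capacity=10000, flushLevel=logging.CRITICAL + 1, target=target)

logger = logging.getLogger("lokai.batch")  # illustrative name
logger.setLevel(logging.INFO)
logger.addHandler(buffer)

logger.error("line 3: unknown product code")
logger.error("line 9: negative quantity")
assert target_stream.getvalue() == ""  # nothing emitted mid-file

buffer.flush()  # end of file: emit the whole batch in one go
```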

To put this into some sort of context, a file processing program might have this type of structure:

initialise_logging()

try:

    for file_to_process in get_next_file():

        set_email_subject("Processing file %s" % file_to_process)

        for line in open(file_to_process):

            ...

            log_a_message(text) # As appropriate

        if log_messages_have_errors():
            abandon()
        else:
            commit()
        flush_log_messages()

except Exception:
    log_a_message(error_text)
    flush_log_messages()
exit()

Obviously, this leaves out a great deal, but it serves to identify what type of calls we might want to make. Much of this is quite straightforward for the logging module. On the other hand, logging does not support dynamic update of metadata, so set_email_subject is something new. As it happens, we also have some work to do to make some of the other things happen behind the scenes. My next post will look at the features of the logging module to see what might be done.
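That said, set_email_subject need not be far from reach. SMTPHandler calls self.getSubject(record) when it emits each message, and the default implementation simply returns self.subject, so updating that attribute between files changes the subject of subsequent emails. A hedged sketch, with placeholder host and addresses rather than real Lokai settings:

```python
import logging.handlers

# Placeholder mail settings, not real Lokai configuration.
email_handler = logging.handlers.SMTPHandler(
    mailhost="localhost",
    fromaddr="process@example.com",
    toaddrs=["data-manager@example.com"],
    subject="File processing")

def set_email_subject(text):
    # SMTPHandler reads self.subject at emit time, via getSubject(),
    # so mutating it here affects every email sent after this call.
    email_handler.subject = text

set_email_subject("Processing file incoming/batch_01.csv")
```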
