Archive for May 2011


Introducing Lokai

May 24th, 2011 — 6:38pm

Lokai has finally hit the streets. This is what I call a business process management package. That is, it helps manage business processes. To me, that means two things: activities with some traceability of actions taken, and documentation. There’s a tad more to it than that, of course. It must be possible to relate things to each other; it must be possible to limit people’s access (or, conversely, allow people to see the things they need to see); and it must be extendable.

The first two go hand in hand in Lokai. All data is structured around multiple hierarchies. Each node in the hierarchy can be documentation, activity, or anything else. That gives us the ability to create different structures for different tasks, so things are related to each other in a way that makes sense in the context. At the same time, that gives us the access control mechanism. A user allocated to a point in the hierarchy has access to all points below that.
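To make that rule concrete, here is a minimal sketch of the idea (this is not Lokai’s actual code; the class and the names are purely illustrative):

class Node:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.allocated_users = set()  # users allocated at this point

    def user_has_access(self, user):
        # A user allocated here, or at any point above, can see this node.
        node = self
        while node is not None:
            if user in node.allocated_users:
                return True
            node = node.parent
        return False

root = Node('projects')
tickets = Node('tickets', parent=root)
tickets.allocated_users.add('alice')
issue = Node('issue-1', parent=tickets)

issue.user_has_access('alice')   # True: she is allocated at 'tickets'
root.user_has_access('alice')    # False: 'projects' sits above her allocation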

Extendability is a key factor in making this package useful, and the component-based approach taken in Lokai has already produced a special-purpose web shop. More will come.

Producing an open-source package is a challenge. I’m still trying to work out how much, and what level of, documentation to produce. I guess I need more users to ask questions before I can get a good grasp of that. More immediate, though, are the issues raised by using Lokai as the basis for its own web site. I want to give registered users access to tickets, for example, but, on the one hand, I don’t want to have to explicitly give permission to each user for each ticket, and on the other, I don’t want to give write access to the node that contains the tickets. As it happens, this issue of grouping people is clearly appropriate for any organisation or project with more than a handful of users, so Lokai springs to life with at least one ticket to be resolved, and it will be the better for it.

I’m looking forward to seeing how others will want to use it.


Deploying packages using Fabric

May 15th, 2011 — 1:58pm

I’ve started using fabric to help manage my servers. This is a great tool that allows me to build a series of commands in Python and then run them on one or more servers, all from my own desk. It all boils down to some handy layering on top of an ssh library. These are early days for me, and I don’t actually have a lot of servers (well, two at the last count), but I do have some things I do quite often that require a number of steps, so a tool like fabric is a potential time saver.
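As a flavour of how it works, here is a minimal fabfile sketch (the host names and the task are purely illustrative, not my real set-up):

# fabfile.py
from fabric.api import env, run

env.hosts = ['web1.example.com', 'web2.example.com']

def uptime():
    # Run a quick check on every host in env.hosts.
    run('uptime')

Running fab uptime from the desktop then executes the command on each host in turn.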

My main use is for deployment of updated software. This happens in bursts, as and when the focus falls on one package or another, and it is helpful to have a pre-packaged set of commands that I don’t have to remember. Roughly, the things I need to do are:

  1. Identify the latest version of the package and find the distribution file in the development environment.
  2. Copy the distribution file over to the target system.
  3. Install the distribution.
  4. Stop and restart the server.

Step 1 is where fabric starts to be useful. I can identify the version, find the target file and do any checking I need to do, all using Python. I’m not afraid of bash, but I am happier in Python.
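A sketch of what that amounts to, assuming the usual python setup.py sdist layout where built files land in dist/ (the package name here is hypothetical):

import glob
import os

def latest_distribution(dist_dir='dist'):
    # Return the most recently built sdist, or complain if there is none.
    candidates = glob.glob(os.path.join(dist_dir, 'my_package-*.tar.gz'))
    if not candidates:
        raise RuntimeError('No distribution files found in %s' % dist_dir)
    return max(candidates, key=os.path.getmtime)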

Step 2 is dead easy. fabric has a copy command, so no further thought is required.
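In fabric terms that is the put() call; a sketch, with a hypothetical remote path:

from fabric.api import put

def push_distribution(dist_file):
    # Copy the built sdist to a staging directory on the target system.
    put(dist_file, '/home/deploy/incoming/')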

Step 3 is where life gets more interesting, as this happens on the remote server. I use the fabric run command to put together a series of commands that can be executed from an ssh command. The trick here is to make sure that the command has all the environment that it needs. I need two things: one is to be in the correct directory, and the other is to activate the correct virtualenv environment.

My servers all run bash, so I can string commands together using ‘logical and’, &&, thus:

command_1 && command_2 && command_3

The ssh command is executed in a minimal environment, so your .bashrc is not going to be run. That means that useful stuff like virtualenvwrapper is not necessarily going to work. In my case I have to explicitly activate the required environment:

cd my_package_distribution_directory && \
source ~/.virtualenvs/my_target_environment/bin/activate && \
python setup.py install

fabric can automatically add the cd some_directory for you if you want. I haven’t found a way of adding any more general prefix. No matter: I can build the prefix easily enough and keep it lying around in the code, so I don’t need anything more in my bash-centric world.
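In the fabric code this comes down to keeping that prefix as a string and tacking the real command on the end; a sketch, using the same hypothetical names as above:

from fabric.api import run

ENV_PREFIX = ('cd my_package_distribution_directory && '
              'source ~/.virtualenvs/my_target_environment/bin/activate')

def install_distribution():
    # Run the install inside the right directory and virtualenv.
    run(ENV_PREFIX + ' && python setup.py install')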

Step 4 introduces another learning point. When restarting the server I need the server to detach from the terminal and run in the background. Normally, I’d simply enter the command terminated with &:

lk_fcgi_server_up.py &

This doesn’t work too well. Doing this remotely using ssh seems to leave ssh just hanging. Using fabric is no better, as that tool appears to terminate forcibly, thus killing the background process. However, bash (version 4.0) comes to the rescue with the coproc command. This runs the target command as a coprocess, complete with connected pipes for I/O. I don’t need to look at standard output from the server, so the simplest invocation seems to do the job:

previous_commands && coproc lk_fcgi_server_up.py

And that’s it for my deploy tool. I can use standard Python for command line options, environment validation and whatever else, while fabric manages the remote connections. One other thing I have learnt, however: if the remote command sequence is at all complicated, it might be better to have a command script sitting on the remote server that handles the whole thing from a single parameterised remote invocation. Of course, the remote script has to be maintained, and deployed on all servers …

Update: Just for clarity:

  • Using c1 && c2 && c3 & is not that useful anyway because it puts the entire line into the background and not just the c3 part.
  • When using coproc you need to add pty=False to the fabric run call, as in the sketch below.
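Put into fabric terms, the restart ends up looking something like this sketch (previous_commands stands for whatever directory and environment set-up comes first):

from fabric.api import run

def restart_server(previous_commands):
    # pty=False, as noted above; with a pty the coprocess would be taken
    # down when fabric disconnects.
    run(previous_commands + ' && coproc lk_fcgi_server_up.py', pty=False)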


Package distribution using Distutils2 concepts.

May 12th, 2011 — 11:04pm

I have been developing a product I want to publish as open source (it’s called Lokai; more of this later), and, being keen, I popped over to distutils2 (aka packaging) to see what’s going on, with the intention of using it if appropriate. I’m not claiming to be following the leading edge here; it just seemed that the original distutils is seen as needing replacement, and setuptools as good but flawed in some way (package management, for instance, see PEP 376). Distutils2 is also going to be the official packaging tool for Python 3, so it has to be worth the look.

We are warned, pretty much at every turn, that distutils2 is not yet ready for production. Actually, there’s a lot there that is good, but it fell over (version 1.0a3) on package data. The developers already know about this; it happens to be the example that I tripped over.

Let me explain. Lokai is an application, rather than a library, and it is web based. It is quite sizable (for me, anyway) and I have some things I need from a distribution tool:

  • I don’t want to have to spend time maintaining a package list. There are 32 packages at the last count, and, while this is not overwhelming, I prefer not to have to fiddle with this every time I add something or restructure.
  • I use Mercurial and I want to be able to link the distribution to a tag in the repository. If this can be done with some automation, so much the better.
  • Lokai is a web application, so there are CSS files, templates and other bits and pieces that are not Python but are essential. I need those to be carried through to the distribution, and the ultimate installation, without undue hassle.

None of these things is particularly difficult, but they emphasize a fact that distutils2 seems to recognize well: there is a real distinction between actions taken by the developer to create a distribution and actions taken by a user when installing from a distribution. This comes about simply because the developer has access to information about the package (from a repository, for example) that is not available to the user; the user relies solely on whatever information the developer has provided. For distutils2 this distinction seems to center on the setup.cfg file. This file drives the whole process and there is no need for a setup.py file. The developer can use all the information available to generate a setup.cfg file, and that file is then all that the user needs to build and install the package. If you want to get more complex than this, of course you can, but this seems to me to cover a wide range of cases.

So, thinking ahead, I decided to write a program to generate a setup2.cfg file. I’m quite happy to help things along by putting files in reasonable places, so the bulk of this program comes down to this (there is a compressed sketch after the list):

  • I use distutils2.core.find_packages to find all my packages from the top level. It’s easy enough to write such a thing, or copy it from elsewhere, but this does the job.
  • I found hgtools, which provides a way of getting a version number from tags in the Mercurial repository. I had to copy and modify it, because the implied version number structure is based on StrictVersion from distutils. My version uses distutils2.version.NormalizedVersion, which gets me a step forward in future compatibility.
  • The program contains fixed, or slow moving, values, such as package metadata, embedded in the source.
  • The output is usable by distutils2. (Or, at least, the version I was playing with. Expect changes.)
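The compressed sketch looks something like this (the real prepare.py uses distutils2.core.find_packages and reads the version from the Mercurial tags; here a tiny stand-in package walker and placeholder metadata keep the example self-contained, and the [metadata]/[files] layout is the one the distutils2 I was looking at expected):

import os

METADATA = {
    'name': 'lokai',
    'version': '0.1',  # the real value comes from the hg tags
    'summary': 'Business process management package',
}

def find_packages(top='.'):
    # Stand-in for distutils2.core.find_packages: any directory
    # containing an __init__.py counts as a package.
    for dirpath, dirnames, filenames in os.walk(top):
        if '__init__.py' in filenames:
            yield os.path.relpath(dirpath, top).replace(os.sep, '.')

def write_setup_cfg(filename='setup2.cfg'):
    with open(filename, 'w') as out:
        out.write('[metadata]\n')
        for key, value in sorted(METADATA.items()):
            out.write('%s = %s\n' % (key, value))
        out.write('\n[files]\npackages =\n')
        for package in sorted(find_packages()):
            out.write('    %s\n' % package)

if __name__ == '__main__':
    write_setup_cfg()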

Of course, I’m not using distutils2, so the next step is to take the output from my prepare.py program (setup2.cfg) and feed it into a setup.py. The setup.py is relatively simple – it reads the prepared configuration file and converts data and data names as appropriate for a call to distutils.core.setup. This setup.py, plus setup2.cfg, is included in the distribution.
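The shim itself is not much more than this sketch (it assumes only the [metadata] and [files] sections are present; the real file maps rather more fields):

# setup.py
from ConfigParser import ConfigParser  # configparser in Python 3
from distutils.core import setup

config = ConfigParser()
config.read('setup2.cfg')

setup(
    name=config.get('metadata', 'name'),
    version=config.get('metadata', 'version'),
    description=config.get('metadata', 'summary'),  # distutils2 'summary' maps to 'description'
    packages=config.get('files', 'packages').split(),
)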

I have, I know, inserted an extra step into the process, but I can automate my way around that if I want to. What I do have, however, is a set of tools to build the distribution that stay with me. I don’t have to include all the package searching and version calculation stuff in the distribution. I get to keep these in my repository, of course, but the user of my distribution does not need them, and, with this process, never sees them. I think that is ultimately simpler for all concerned. I’m also a step closer to using distutils2 when the time comes.

