
Wednesday 25 May 2011

The Scrum Way (Training with the master)


I recently took on a new business challenge, which, until I learned the error of my ways, I described to myself in shorthand as 'moving from using scrum to coaching scrum by helping others to use it'. Although mistaken (I will return to the nature of my error before I sign off), this idea was valuable in that it made clear to me that getting started on my certification was worthwhile on several levels. When it turned out that the father of scrum, Jeff Sutherland, regularly came to my home town of Amsterdam to run the Scrum Alliance's Certified Scrum Master course, a prospect that had been merely practical and prudent became positively pleasurable.

A few short weeks after making initial inquiries, just after 9 am on a Monday morning, I was among the few dozen students listening eagerly as Jeff settled everyone in at the start of the two-day training with the tale of his recent trip to Japan. While there he met Ikujiro Nonaka, who, along with Hirotaka Takeuchi, had written the 1986 Harvard Business Review paper, “The New New Product Development Game”, that directly inspired scrum. On the same trip, some Japanese scrum masters taught Sutherland to see scrum in a new light: it was not a development (or project management) framework, it was a way of doing and a way of being, in short, a way of life!

As you might expect after an introduction like that, Jeff (ably assisted by Nicole Belilos of Xebia) ran the training like a scrum: they started with a modified planning session and maintained a course backlog throughout the training; sessions were time-boxed and topics estimated; feedback gathered actively along the way was immediately actioned in the form of updated priorities in the course backlog; velocity was measured; and they were even toying with a metric based on 'aha-erlebnissen' (aha moments).

One of the first topics handled was Shu Ha Ri, the Japanese martial-arts model of the stages of mastery: shu (follow the rule), ha (break with the rule), ri (transcend the rule).


This seems to be a hot topic at the moment in the scrum community, where much of the discussion centres on what the different phases constitute and how to translate the concept into action. I suspect that this might be missing the point somewhat. At any rate, at least as important in the idea of shu ha ri is the (implied) quality of the master or sensei: simply observing the master at work can be informative and instructive, and a true master will inspire by their very way of doing things.

“Scrum the scrum!” Sutherland exhorted us repeatedly throughout the course. “Where did this idea come from that scrum only works for software development?”

My most important eureka moment came as a delayed reaction, a few days after I had completed the course: I needed to live the scrum way! Getting back to my faulty shorthand: I needed always to keep in mind that the best way to teach scrum is to do scrum, and to do it well. Up to that point, in putting flesh on the bones of the aforementioned business opportunity, my colleague and I had navigated a sort of envisioning phase using a malformed Kanban board and somewhat flexible time-boxing (oxymoron?). Although we had produced some value along the way, including a mission statement and a set of objectives, we had ultimately failed to generate a product backlog and were unsure of our next steps. Applying my newest insight, I realised that if the context of a scrum is foreign enough, it might be useful to adopt the shu stance by default at the outset and install the entire scrum structure unquestioningly.

It worked a treat: assuming the shu stance delivered much-needed focus. For instance, the product backlog is an essential artefact of scrum, so one way or another we had to produce one. This turned out to be a great starting point. To produce a product backlog we needed to assume that it was possible to break the services we had envisioned into chunks small enough that we could 'build' complete increments of them within a single sprint. To realise value and gather feedback as early as possible, we also needed to aim for a viable 'release' within two or three sprints. With this mindset adopted, a beautifully timed and wonderfully insightful suggestion from my girlfriend tipped whatever point it was, and things really began to fall into place. We now have a product backlog and are roughly halfway through our first strictly time-boxed five-day sprint. We are, so to speak, well on our way, with humble thanks to the master.
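For the record, the shape of what we ended up with can be sketched in a few lines of code. The sketch is purely illustrative (the story names, points and velocity are invented, not our actual backlog), but it captures the discipline the shu stance imposed: every item must be small enough to be a complete increment within one sprint, and the first release must be reachable within two or three sprints.

    # A minimal, hypothetical product backlog: an ordered list of stories, each
    # small enough to be delivered as a complete increment within one sprint.
    backlog = [
        {"story": "Publish service catalogue page", "points": 3},
        {"story": "Intake form for new clients",    "points": 5},
        {"story": "Outline of first workshop",      "points": 5},
        {"story": "Basic booking and invoicing",    "points": 5},
    ]

    velocity = 5          # assumed points per five-day sprint
    release_budget = 3    # aim for a viable 'release' within two or three sprints

    # Walk the backlog in priority order and find the release cut line.
    remaining = velocity * release_budget
    release = []
    for item in backlog:
        if item["points"] > remaining:
            break                     # everything below this line waits
        release.append(item["story"])
        remaining -= item["points"]

    print(release)                    # the candidate scope of the first release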

Wednesday 18 May 2011

A common objection to evolution theory (Can emergent design work?)

One of the deep problems that both supporters and detractors of the theory of evolution wrestle with is, in effect, the problem of emergent design. How did things get to be the way they are? More precisely, given the enormously complex and intricately interdependent interplay of fully formed, specialist agents in today's world, how did life get from there to here? If life started as unicellular organisms (or, even further back, as amino acids), how could something as complex and apparently well designed as a human being, with all its equally well-designed component bits and pieces, emerge? To cite a commonly used example, how could something like the eye emerge?

Whatever mechanisms are at play, we can be sure that there was never a time in the long history of life on earth when eyes, or hearts and lungs and livers for that matter, were self-sustaining organisms, able to look after their own survival and evolving in their own right up to the point where they were useful to some other organism and were co-opted. That means that somehow, on the long road from eukaryotes (prokaryotes, in fact) to human beings, hearts, lungs, livers and the rest evolved along with the organism; that is to say, we started as a minimum viable product (a self-metabolising unicellular organism) and iterated from there to here.

Easily posited, but, in the case of the cardiovascular system for instance, errors will be selected against mercilessly; intermediate forms of the lungs, or the heart, are even harder to imagine than intermediate forms of the eye. If we examine our most recent ancestors in the tree of life, or glance sideways at our cousins (chimps, bonobos, etc.) and other close relatives, we can see this confirmed: there are no substantive differences between the cardiovascular systems current in mammals, including seafaring mammals like dolphins and whales.

But there are land animals whose respiratory and circulatory systems differ substantively from ours: insects have a so-called open circulatory system; they have no lungs and do not breathe as we do, and their hearts are very different from ours too. Insects have small holes in the sides of their exoskeletons (spiracles) that allow air to enter a system of tubes known as tracheae. The tracheae extend to all parts of the body, terminating as the thinnest of tubes (tracheoles) in the tissues where the oxygen will be delivered. The tissues are flooded with the rest of the fuel mix ('blood': food, water and other essentials), as the heart only 'pumps' (by peristalsis) so far. All of this means that the biochemical magic of metabolism in insects resembles so many tiny generators firing away on-site.

Mammals and insects are indeed extremely distant relatives, having diverged on the tree of life when vertebrates and invertebrates diverged, but I ask you to bear with me, because an open circulatory system can act as a credible model for an intermediate form on the long journey from there (unicellular organisms) to here (human beings with closed circulatory systems). Nature, self-organising between limits and tipping points and acting recursively through selection, effects any such transition through inexorable cumulative change over many, many generations. We, of course, can be a little more direct, even if our method takes its cue from evolution. Thinking in terms of building software:

Expanding the generator metaphor, we could define our architectural vision as replacing all those on-site generators with a single big generator working with a couple of heavy-duty pumps, whereby the fuel pipes are exapted as power lines to deliver directly consumable energy to the cells.

Then we would start small: by de-coupling a single cell from its own power source, for instance. To achieve that for that one cell, we would already need to have modified a small section of fuel pipe so that it could act as a power line, and to have moved 'power generation' outside the cell. We would also have had to protect the rest of the system from possible problems (given that they would probably be fatal!) by surrounding our experiment with tests; in short, our first delivery would have to be a walking skeleton of our solution, a minimum viable feature.
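To make that first, smallest step a little more concrete, here is a minimal sketch in code. Everything in it is hypothetical (Cell, LocalGenerator and CentralSupply are invented names, not an existing design): one cell is switched over to an external supply behind a common interface, and a test surrounds the experiment so that the rest of the 'organism' is protected if it fails.

    # Hypothetical sketch: decouple one cell from its on-site generator.
    class LocalGenerator:
        """The old on-site power source every cell carries around."""
        def draw(self, units):
            return units  # generates exactly what is asked for

    class CentralSupply:
        """The new single big generator, reached via the exapted 'power line'."""
        def __init__(self, capacity):
            self.capacity = capacity
        def draw(self, units):
            granted = min(units, self.capacity)
            self.capacity -= granted
            return granted

    class Cell:
        """A cell only knows it has *some* power source with a draw() method."""
        def __init__(self, power_source):
            self.power = power_source
        def metabolise(self):
            return self.power.draw(1) == 1  # True if the cell got what it needed

    # The surrounding test: the decoupled cell must behave exactly as before.
    def test_decoupled_cell_still_metabolises():
        old = Cell(LocalGenerator())
        new = Cell(CentralSupply(capacity=100))
        assert old.metabolise() == new.metabolise()

    test_decoupled_cell_still_metabolises()
    print("walking skeleton intact")

The design choice that matters here is the shared draw() interface: as long as it holds, cells can be migrated to the central supply one at a time.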

Thereafter, iteration by iteration, through continuous inspection and adaptation and constant refactoring, our design is refined both piecemeal and as a whole, emerging along the way in perfect tandem with the (details of the) goal that we are discovering.

Wednesday 11 May 2011

Comparing Rugby and American Football (Feature teams vs component teams)

Given that they share a common ancestor, the general contours of rugby and American football are quite similar. The goal of both sports is to get the (oval) ball into the opponent's end zone (a try or touchdown, the most valuable score). In both sports it is also possible to kick goals. In the specifics of the two sports, however, there is significant variance: in rugby, for example, it is always illegal to throw the ball forward, whereas in American football one forward pass per play is allowed.

It is not my intention to compare and contrast the (relative) merits, or otherwise, of rugby and American football generally, but rather to investigate the 'efficiencies' realised by the approaches to specialisation current in the respective codes. Aptly enough, given that Scrum got its name from rugby, we can view rugby teams as analogous to feature teams and American football teams as analogous to a collection of component teams:

In (professional) American football it is common to have an offensive team, a defensive team and special teams (e.g. for kicking plays). There is very little overlap in the personnel of these units; a quarterback never lines up in defence. As a consequence, no one unit can execute the complete set of possible plays, so when the ball changes hands in American football, play usually stops to allow both sets of players to be replaced.

An obvious advantage of this is that, for example, a defensive tackle can concentrate on breaking the offensive line and achieve dizzying excellence in that one aspect of play. An offensive tackle will specialise in blocking players attempting to get at his quarterback, expending very little effort on learning how to break lines...

There are of course specialist roles in rugby too. Roughly speaking, a rugby team is split into backs and forwards, whereby the forwards contest and control the ball and the backs move the ball. Two positions are further instrumental in directing play and providing a link between the forwards and the backs: the scrum half and the fly half. The fly half also usually takes care of all of the place kicking.

Rugby players therefore have very specific jobs to do, but are regularly called upon to help the team effort in ways not defined by their role (generalising specialists). One of the advantages of this approach is that players can improvise: should the fly half receive the ball during a play designed to provide a goal-kicking opportunity and see the envisaged opportunity blocked, he can still choose from a range of possibilities, including options beyond his usual remit, to further the play (sometimes to even greater advantage than originally planned).

Assuming that quality at the highest levels in both sports is comparable, and that points scored are a reasonable measure of 'value' delivered (to all parties: participants, fans, investors, etc.), we could define efficiency as follows: points scored / (play time × squad size)

The average number of points scored in NFL games through the 2008 season was 44 (source). A similar figure for rugby games is 38 (source). In a single game of American football (60 minutes of play time) up to 90 players may be actively involved, although only 11 players per team are ever allowed on the pitch at any one time. This yields a rough scoring efficiency of 0.008. In a rugby game there are a maximum of 44 players involved (both teams start with 15 players and can make up to 7 substitutions), and play time is 80 minutes, which results in a scoring efficiency of roughly 0.011, about 30% greater than the throughput for American football.
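To make the arithmetic easy to check, here is the calculation spelled out; the averages are the ones quoted above, 'play time' is regulation playing time, and 'squad size' is the maximum number of players actively involved:

    # scoring efficiency = points scored / (play time in minutes * squad size)
    def efficiency(points, play_minutes, squad_size):
        return points / (play_minutes * squad_size)

    nfl   = efficiency(points=44, play_minutes=60, squad_size=90)   # ~0.0081
    rugby = efficiency(points=38, play_minutes=80, squad_size=44)   # ~0.0108

    print(f"NFL:   {nfl:.4f} points per player-minute")
    print(f"Rugby: {rugby:.4f} points per player-minute")
    print(f"Rugby / NFL: {rugby / nfl:.2f}")   # ~1.32, i.e. roughly 30% greater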

If we consider that an American football game usually takes about three hours to complete (due in part to the regular substitution of entire teams), compared to a rugby match, which normally finishes in well under 100 minutes, the efficiency gap yawns even wider... In that case the feature teams (rugby) could be delivering value at up to four times the rate of the collection of component teams (American football).
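The same sum with wall-clock time substituted for playing time gives a feel for how wide the gap becomes. The durations below are assumptions (roughly three hours for an American football game, about 90 minutes for a rugby match), not measurements:

    # Same formula, but with elapsed (wall-clock) minutes instead of playing time.
    def efficiency(points, minutes, squad_size):
        return points / (minutes * squad_size)

    nfl   = efficiency(points=44, minutes=180, squad_size=90)   # ~0.0027
    rugby = efficiency(points=38, minutes=90,  squad_size=44)   # ~0.0096

    print(f"Rugby / NFL: {rugby / nfl:.1f}")   # ~3.5, approaching 4x for shorter matches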

Wednesday 4 May 2011

Mother Nature as exemplary engineer (Why does Scrum work?)

Evolution in a (super simplified) nutshell is: descent with modification, acted upon by natural selection.

That is to say, organisms pass properties from one generation to the next via some process or other. This process is never perfect and therefore each new generation of a given organism will differ slightly from its parent generation. These differences will be selected upon by the environment; changes that offer some advantage to an organism in a given environment will eventually spread throughout a population (survival of the fittest).
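As a toy illustration of that loop (nothing here models real biology: the 'genome' is a single number and 'fitness' an invented function), a few lines of code are enough to show descent with modification being acted upon by selection. Note that the toy is deliberately generous, in that half of all copying errors happen to be beneficial, which is exactly where the real difficulty, discussed next, comes in.

    import random

    # Toy model: a 'genome' is one number; fitness is how close it is to a target.
    TARGET = 100.0
    def fitness(genome):
        return -abs(genome - TARGET)

    population = [0.0] * 20          # generation zero: identical, far from the target
    for generation in range(200):
        # descent with modification: offspring are imperfect copies of parents
        offspring = [g + random.gauss(0, 1.0) for g in population for _ in range(2)]
        # natural selection: only the fittest survive to parent the next generation
        population = sorted(offspring, key=fitness, reverse=True)[:20]

    print(round(max(population), 1))  # after enough generations, close to 100.0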

However, most (significant) modification will be disadvantageous, certainly if change is achieved only by (random) errors in the copying process. This should mean that almost every trial ends in (fatal) error. In that case, even the 3 billion or so years that have passed since the first unicellular organisms appeared on earth would not be nearly enough time to produce life as we know it today in all its complex diversity.

A 'recent' discovery in physics helps address this problem: complex adaptive systems self-organise.

What?!?

A discussion of this phenomenon of self-organisation, and/or of complex adaptive systems, is way beyond the scope of this piece (not to mention monumentally challenging to your author's intellectual capacities), so I will just give a (hopefully compelling) example to illustrate the phenomenon.

Consider the termite colony. Termite mounds are built without a site director or a project manager or indeed any kind of management structure. How do all those termites, each scurrying around in its own little restricted world, so obviously manage to work together as successfully as they do? They, that is to say the complex adaptive system that is a termite colony, self-organise(s).

This tendency to self-organise introduces another possible source of variance into evolution. The environment selects on these experiments in self-organisation in precisely the same way as it does on the changes in the genetic code of individual organisms: blindly but steadily (Mother Nature literally takes aeons to realise her handiwork), weeding out that which does not work well in favour of that which does.

All well and good, but what good could all this possibly be to us in software development?

Well, if a few extremely smart software engineers were to get together to see how they could best build good software, they might consider how Mother Nature goes about her work. If it were possible to speed up the process of selection acting upon modifications, they might even be on to something...

Well, a bunch of software engineers did go through an approximation of this thought process, and they came up with... Scrum!

Scrum in a (super simplified) nutshell is: self-organising units engaged in constant experimentation, the results of which are acted upon by (short) formal feedback cycles at multiple levels.
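Laid alongside the toy evolution loop above, the parallel is easy to see even in sketch form. Everything below is invented for illustration: a self-organising team runs a batch of small experiments each sprint, and a short feedback cycle decides what makes it into the product.

    import random

    # Illustrative only: experiments per sprint, filtered by a short feedback cycle.
    backlog = [f"experiment-{i}" for i in range(12)]
    product = []
    velocity = 3   # experiments the team can run per sprint

    while backlog:
        increment, backlog = backlog[:velocity], backlog[velocity:]   # one sprint
        # sprint review / retrospective: keep what delivered value, drop the rest
        product.extend(item for item in increment if random.random() > 0.3)

    print(f"{len(product)} of 12 experiments survived the feedback cycles")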

In short, Scrum works because it is a modified copy of how that greatest of all engineers, Mother Nature, goes about her job.