Moving away from documentation and requirements gathering

No plan survives first contact with the enemy.

— Helmuth von Moltke the Elder (paraphrased)

It could have been the motto of the Agile Manifesto[AM01]. It could equally have been said of Christopher Alexander’s work after Notes on the Synthesis of Form[Notes64]. Much of both stems from developers repeatedly seeing their plans fall foul of poorly estimated costs. Overly complex systems often grew out of simple-seeming documents. Furthermore, customers were regularly dissatisfied with the results, even when presented with precisely what they had asked for. Modern architects, working with their antiquated design processes, encountered the same obstacles.

The Agile Manifesto has an explicit preference for working software over comprehensive documentation. The value of not producing a lot of documentation up-front is much the same as in Christopher Alexander’s methods, which required on-site presence and situational reviews. Plans have value, but they cannot be falsified by paper reviews; reviews must happen at the place1 itself. Indeed, the capacity to reject designs in the early stages is why mock-ups were so crucial to Alexander’s process. The preference for working software is a preference for something that can be inspected, understood, and reviewed in situ. Immediate, visceral feedback is far more potent than documentation when determining the next step.

The manifesto was also making a statement in asserting that the documents typically produced during development had no inherent value; only documentation for an extant product was valuable to the end user.

A further problem with documents typical of the time was that they denied the programmer’s mastery, purpose, and autonomy. Plans were orders, something to follow; the only choice was whether to do as instructed or remove yourself from the project. This echoed Alexander’s thoughts on master plans for site development.

[T]he existence of a master plan alienates the users … After all, the very existence of a master plan means, by definition, that the members of the community can have little impact on the future shape of their community, because most of the important decisions have already been made. In a sense, under a master plan people are living with a frozen future, able to affect only relatively trivial details. When people lose the sense of responsibility for the environment they live in, and realise that they are merely cogs in someone else’s machine, how can they feel any sense of identification with the community, or any sense of purpose there?

— Christopher Alexander, The Oregon Experiment[TOE75], pp. 23–24.

However, as is often the case with an immature collective, things went too far. Documentation is for more than just the end user. It provides fertile ground for insights and elicits unexpected requirements. It offers a way to record how you arrived at your decisions and what informed them. We must also recognise that some specific forms of documentation are mandatory. Indeed, some paperwork is used to verify that we have achieved our expected outcomes and reached an arbitrary payment gate, while other documents may outline contractual obligations to security or safety.

People overlook the powerful effect of writing on understanding a problem. Writing it out often helps you to notice gaps in your knowledge or reveals contradictory beliefs. I studied design patterns to write this book, but the writing itself has also been an educational process.

These days, end-user documentation, the only documentation implicitly allowed, is itself considered an indicator of poor design, as the UX design should make the application learnable without it. User manuals hold similarly marginal value for the developer; a library should be well commented and easy to grasp, avoiding the need to refer to separate documentation.

The Agile Manifesto’s signatories showed no fondness for the documentation typically produced as a byproduct of the development process, presumably because it was not a product in itself. They appeared to support eradicating documents filled with gathered requirements and technical designs produced solely to be followed. There is sense to this. Planning the whole development up-front is a bad idea only because it’s impossible to foresee the future, and so the plan will inevitably be flawed. Preparing everything at the beginning does harm only if you then force yourself to strictly follow that inevitably wrong plan.

So, why plan at all? Well, because planning is faster than simply doing. Planning what you’ll cook for dinner each night of the week can simplify both the shopping and the cooking. Sure, things can change, but at least you have an overall idea of what you have to work with, and when you know what you have to work with, you can balance the overall effort and cost of the operation. This is what Christopher Alexander did. His process included a lot of up-front planning: his teams budgeted for parts and selected the patterns to use in the overall construction. His work in Notes on the Synthesis of Form[Notes64] is all about extensive up-front plans, but they are plans for reducing mistakes, not plans that had to be followed blindly. This is how we arrive at the thinking behind the often-quoted “Plans are worthless, but planning is essential.” The point of a plan is to limit the required effort to the minimum, not to stop thinking ahead entirely.

A preference for working software over comprehensive documentation has been interpreted as advocating the removal of all requirements-gathering steps. However, this leads to software development without an initial phase in which to gather tasks, figure out the complications, and reduce risk by ensuring the bases are covered. Does up-front requirements gathering decrease risk in practice, though?

Preparation versus risk relates to the theory of quantity over quality: practice, deliberate or otherwise, makes you better at an activity. It allows for mastery and gives you new perspectives for making better decisions. It also explains how evolution wins every game ever played. Producing an order of magnitude more software products to show to the customer for feedback, rather than spending days, weeks, or months studying the customer’s requirements, sounds very similar to the tale of the clay pots found in the book Art and Fear[ArtFear94]. People often repeat it as an example of how quantity beats quality when the goal is to create the best possible product in the end.

The tale goes like this: a ceramics teacher announced that the class would be split into two groups. The quantity group would be graded solely on the weight of the work they produced, while the quality group would be graded in the traditional way, on the quality of a single pot. The experiment had a most curious result: the highest quality works were all produced by the quantity group. The moral of the tale is this: to deliver the best possible output, practical experience and many iterations trump time spent in preparation and deep study.

I would now like to relay a personal story in keeping with this tale. In college, I studied Music Technology, a course that explored the technological foundations of music and other media in the modern age. It covered many aspects of music, from royalties and copyright law to physically constructing a studio. Other workshops were more musical; among them was a series of units on the composition and production of musical tracks. Music production, the joining together of many disciplines, was the reason I had taken the course in the first place. I wanted to make music using better tools and to learn better composition techniques and songwriting skills.

My music-making capabilities gradually improved over the time I spent on these units. I took each track one at a time, releasing perhaps a single worthwhile piece of music each term. That’s three months per piece. I could see I was improving with each new track and was glad to be progressing. Eventually, we had to decide what our final project for the course would be. I am a game developer at heart and have always wanted to make games, so I committed to a project of writing all the music for a fictional computer game.

This project was a tremendous change of pace for me: without fully realising it, I had signed up to produce more tracks for this single project than I had produced in the whole course up to that point. My portfolio stood somewhere in the region of ten to thirteen songs at the time; I cannot remember precisely. Nevertheless, I decided to produce a track for each level of the fictional game, with themed music for each act and a general motif for the whole suite. I also followed some assumed guidelines for game music: no choruses or refrains, just a steady mood, and reduced dynamics, so a player could set a volume level and never have to sit through long silences or worry about the music drowning out the sound effects they were listening for.

With these limitations in place and the sudden need for more than twenty tracks, all of which had to be composed before the project’s due date, I started work. As I worked, I decided which instruments were core to the pieces and which were track-specific. I built up a set of practices for constructing tracks and learned how the filters and effects I used would sound even before I applied them. Overall, the time it took to create each song grew shorter as the project advanced. When it came to the final piece, it took me no more than an hour from first note to final mix.

This is the part where my tale reflects the story of the clay pots. Whereas in the ceramics workshop the teacher graded the quantity group by weight, I was due to be graded on production quality and on my understanding and implementation of the techniques we had been taught in class. I admit that I had been a very poor composer when I started the course; over time, I had improved to the point where I was merely third-rate. However, after the game-music project, the teacher who graded me said the last few tracks were the best he had heard me create. In effect, the more pieces I completed and the deeper in the zone I became, the better the individual compositions were.

More consequential for me was how my new production quality stuck after the project. Today, I am a wretched composer for lack of practice, but the sudden improvement at the time meant any music I produced after the project was elevated to a new level. For this reason, I would argue that the common reading of the clay-pots story is, to some extent, incorrect. Some people interpret the result of the experiment as a validation of agile methodologies such as Scrum, when it is nothing of the sort.

A Scrum-driven project does not aim to build many individual products, producing one great product in passing, almost by accident. It is still a process aiming to produce one final viable product; it is an iterative development process. Therefore, it sits in the camp of the quality group, the group graded on the quality of a single pot. Ultimately, my evaluation was based on the quality of my final suite of music, not on how many tracks I had produced. The takeaway from the clay-pots tale, then, is that it’s not the pot that gets better, but the potter.

This is an important distinction because, in some projects, you only get one chance to make the final product. Perhaps it’s difficult or impossible to grow the final product from a smaller one, and iterative development is not an option. In such situations, failing to look ahead to the final product and what it should be can lead to nothing at all. These projects often harbour insurmountable problems created in ignorance during early development; consider, for instance, the cost and complexity of fixing security issues after your software’s first alpha or beta release. What the clay-pots experiment tells us, then, is that we can do better on these projects if we understand that practice and preparation are two things which can sometimes be one thing.

In conclusion, agile development can be forward-thinking and include up-front investment, just as Christopher Alexander’s team selected design patterns to structure their projects. It can be about learning deeply enough to remove the need to look forward. Agile principles prioritise faster development to facilitate swifter learning, as well as building good tools to make future work more manageable, just as Alexander developed new tools and materials to complete his constructions. You shouldn’t expect to fashion a great composition on your first attempt if you don’t study, but if you intersperse your studies with a hundred creative acts, your last one will be better than if you had spent the whole time with your nose in a book.

Agile approaches are suitable for training your team to get things right the first time, every time you task them with a familiar project. However, this form of development can be wasteful on exotic projects, where there will be major mistakes and plenty of technical debt; this is why we need models and prototypes. Christopher Alexander strove to use flexible materials to avoid the costs of these unknowns: materials where errors could be undone or avoided, even as they emerged.

Every process depends on the wisdom of the team, on instinctively knowing the right thing to do. Agile methods allow a team to do that in the same way the distribution of knowledge via a pattern language does. They also allow a team to become better at its craft through accelerated experience. If what you are building can be built up iteratively, then all the better, and in software, it usually can. Do not fear throwing away bad work and early attempts; in fact, you should be fearful of not throwing code away.

Rebuilding whole modules from scratch becomes quicker as the team gets better at making them through practice. What you cannot do safely is rewrite a module you didn’t write yourself. And rewrite early rather than late, as by then your wisdom will have already begun to fade.

This kind of evidence might not be enough for you. You want to know why. Why is up-front planning not as effective as we expect?

1. Much like a gemba walk is about literally walking in the place where the work happens, a review of a plan must touch the reality of the problem space.