Poor Winston Royce. The guy's big idea has been vilified because of a popular misunderstanding of its meaning. The great irony of the Waterfall story is that Royce himself was trying to describe a more feedback-driven, overlapping process.
But somehow, the Waterfall strawman itself became the reference model, presumably because it appealed to the authoritarian command-and-control mass-production paradigm of 1970s American business culture. In a triumph of absurdity, that very culture was about to reach its nadir of quality, productivity, and profitability, as the dawn of an era of humiliating ass-kicking at the hands of the Japanese lean producers had just begun.
The contemporaneous emergence of the "structured" paradigm, with all of its top-down orientation, probably only encouraged the enthusiastically misguided technology managers of the day. After all, wasn't that the very promise of computer technology? That it would make predictions, automate production and accounting functions, and provide managers with awesome new powers of control, so that we didn't have to rely so much on those pesky, unreliable workers anymore?
Of course, many serious thinkers about software development were uneasy with that direction, because few of them ever thought of the problem that way in the first place. The phasist project management model was imposed from without by a zealous and out-of-control management culture, desperate to assert social dominance in the face of mounting economic failure.
Like any big new idea, it took quite a long time for the structured/phased paradigm to fully diffuse through the profession. Consequently, it also took quite a long time for the profession to come back with a well-formulated reaction. The most forceful expression of that reaction was probably Barry Boehm’s Spiral Model.
The Spiral Model was not the revolution; it was the counter-revolution. But the momentum of the Waterfall/SDLC was such that it took a decade for the Spiral to be fully realized as an enterprise-class methodology, in the form of the Rational Unified Process (RUP). Now, for those who were paying attention, the elements of RUP had been visibly gestating all the while; it just took some time to build up the fighting weight necessary to challenge the reigning champion. But for the Object-Oriented faithful, it had always been a given that iterative development was the only credible approach.
The challenges before RUP were formidable. It had to simultaneously replace structured analysis/design methods and phased project management methods. But change was in the air, and the explosion of new technology and the resulting financial speculation suddenly made it look very uncool to hang on to yesterday’s business processes. The dysfunction of phasist project management was abundantly clear to anybody who was paying even the slightest attention, so the world was finally ready for a credible contender. On the other hand, a generation of middle managers would never accept a methodology that threatened the corpulent bureaucracy that would allow them to rest and vest through a finance-fueled stock market orgy, so it was in everybody’s best interest to make RUP as bloated and artifact-laden as possible. In this way, RUP was destined not to last, but it did serve to introduce a big idea into mainstream thought:
Maybe 5 iterations of 100 requirements is better than 1 iteration of 500 requirements.
Just as the asteroid killed off the dinosaurs to make room for the birds and the mammals, the dotcom bubble accelerated the retirement of a generation of middle managers and left behind a world less hospitable to the flourishing of heavyweight processes. RUP's unwieldy bloatocracy had outlived its usefulness to the scrappy survivors of the Great IT Catastrophe of 2001. Furthermore, what you probably learned from executing 100 requirements in an iteration is that they have a funny way of multiplying into 200 requirements. Building 200 requirements at a time is not really that much more fun than building 500 requirements at a time; you just learn the truth a little faster (i.e. your budget is screwed and your quality will be awful).
Our faith in iteration remained unshaken, but something about the artifact-and-scope-driven approach of the 1990s was clearly not working. Fortunately, while the Object Establishment was busy making the enterprise safe for the Spiral Model, another group was determined to continue to drive the idea to a more extreme conclusion:
Maybe 50 iterations of 10 requirements is better than 5 iterations of 100 requirements.
If 10 requirements typically take a team 2 weeks, maybe 50 iterations of 2 weeks is better than 50 iterations of 10 requirements.
My experience (and I believe that of many others) with scope-driven, coarse-grained iteration is that it does not work well. On the other hand, the enduring popularity of Agile time-boxed iteration over the last eight years suggests very strongly that it does work well. The wheels turn slowly, but the champions of iteration were right all along. Yet this mixed history suggests a troubling question: why does 10 work when 100 doesn't? What is a good size? What is the ideal size?
The answer to that question takes us right back to the beginning of our story. Back when the software development world was first trying to recreate the obsolescent mass-production culture of its day, the Japanese had already provided us with the answer. The ideal batch size is one. Over the last year, I’ve been operating software development kanban systems with and without Agile-style iterations. My goal as a lean workcell manager has been the realization of one-piece flow. What I’ve learned from my experience is:
In a well-regulated pull system, iterations add no value at all.
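The effect of batch size on cycle time can be sketched with a toy model (the numbers here are purely illustrative: one worker, a fixed amount of work per item, and a batch that only ships when every item in it is finished):

```python
def average_cycle_time(batch_size, total_items=100, hours_per_item=4):
    """Toy model of batch transfer: work proceeds one item at a time,
    but finished items only ship when the whole batch is done.  An
    item's cycle time runs from batch start to batch shipment."""
    clock = 0.0
    cycle_times = []
    for _ in range(0, total_items, batch_size):
        batch_start = clock
        for _ in range(batch_size):
            clock += hours_per_item        # one-at-a-time processing
        # the whole batch ships together, so every item waits for the
        # slowest: each item's cycle time equals batch completion time
        cycle_times.extend([clock - batch_start] * batch_size)
    return sum(cycle_times) / len(cycle_times)

for b in (1, 10, 100):
    print(f"batch size {b:3d}: average cycle time {average_cycle_time(b):5.0f} hours")
```

Average cycle time grows linearly with batch size even though total throughput is identical in every case, which is the lean argument for a batch size of one.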
Just as RUP was a historical necessity to establish the legitimacy of the question, I believe that the first-generation Agile processes are a historical necessity to confirm that batch size is just as important to a development process as it is to a manufacturing process. And now that we know what the question really is, we find that we also know the answer. Iteration is only a transient concept to lead us to the deeper truths of pull and flow. The second generation of Agile processes will have no need for iterations as such. Continuous integration will be matched by continuous planning and continuous deployment. Live software systems will evolve before your eyes, and Little's Law will rule the world.
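Little's Law itself is just arithmetic: average work-in-progress L equals average throughput λ times average cycle time W. A minimal sketch, with hypothetical team numbers:

```python
def average_cycle_time_days(wip, throughput_per_day):
    """Little's Law rearranged: W = L / lambda.
    wip: average number of items in flight (L)
    throughput_per_day: average completion rate (lambda)"""
    return wip / throughput_per_day

# hypothetical kanban board: 20 stories in flight, 2 finished per day
print(average_cycle_time_days(20, 2))   # 10.0 days on average
# halving WIP at the same throughput halves the average cycle time
print(average_cycle_time_days(10, 2))   # 5.0 days on average
```

This is why a well-regulated pull system manages WIP directly: at a given throughput, limiting the work in flight is the only lever that shortens cycle time.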