Yesterday I was involved in a workshop on agile development at scale. At one point in the conversation we started talking about the relationship between cost and quality. Some of the people in the workshop were relatively new to agile and still believed the traditional theory that building in high quality costs more, sometimes substantially more. This does appear to be true on traditional waterfall projects, but some people were making the mistake of assuming it was a "natural law of IT" which must also apply to agile project teams. I naturally jumped on that idea and described how agile developers have found that writing high quality code leads to lower development costs and shorter time to value, in direct contradiction to traditional theory. A few people struggled with the idea for a bit, and one was pretty adamant that in some cases the need for very high quality does in fact lead to greater cost and time. He talked about his experiences on large-scale Rational Unified Process (RUP) projects and in particular how some URPS (usability, reliability, performance, and supportability) requirements can increase your cost. At this point Per Kroll, co-author of Agility and Discipline Made Easy: Practices from OpenUP and RUP, jumped into the conversation and pointed out, using Toyota's lean approach to manufacturing as an example, that although higher quality does lead to lower cost in most cases, the agile community didn't have the relationship between quality and cost completely right. My spidey sense told me that a learning opportunity was coming my way.
Per and I had an offline discussion afterwards to explore what he'd been observing in practice. In most situations it does appear that higher quality leads to lower costs and shorter delivery times, something that Per and I have observed numerous times. This happens because high quality code is much easier to understand and evolve than low quality code -- the agile community has found that it is very inexpensive to write high quality code by following practices such as continuous integration, developer regression testing [or better yet, test-driven development (TDD); see the sketch below], static code analysis, following common development conventions, and agile modeling strategies. When you "bake in" quality from the start by applying these techniques, you avoid leaning on traditional techniques such as reviews and end-of-lifecycle testing (which are still valid on agile projects, but should not be your primary approach to testing), whose long feedback cycles prove costly in practice.

But, as we've learned time and again, when you find yourself in the more complex situations of Agility@Scale, sometimes the mainstream agile strategies fall down. For example, in situations where the regulatory compliance scaling factor is applicable, particularly regulations around protecting human life (e.g. the FDA's 21 CFR Part 11), you find that some of the URPS requirements demand a greater investment in quality, which can increase overall development cost and time. This is particularly true when you need to start meeting 4-nines requirements (i.e. the system needs to be available 99.99% of the time), let alone 5-nines requirements or more -- the quick calculation below shows why. The cost of thorough testing and inspection can rise substantially in these sorts of situations.
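To make the TDD rhythm concrete, here is a minimal sketch in Python using the standard unittest module. The Account class and its tests are purely hypothetical illustrations of mine, not code from any project discussed above: in TDD you write the tests first, watch them fail, then write just enough production code to make them pass, leaving behind a regression suite that runs on every build.

    import unittest

    # Hypothetical Account class, written AFTER the tests below in true
    # test-first style; it exists only to illustrate the technique.
    class Account:
        def __init__(self, balance=0):
            self.balance = balance

        def withdraw(self, amount):
            # Reject overdrafts; this guard exists because a test demands it.
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    class AccountTest(unittest.TestCase):
        # In TDD these tests come first: run them, watch them fail, then
        # write just enough of Account to make them pass. Afterwards they
        # run on every build (e.g. under continuous integration) as a
        # regression safety net.
        def test_withdraw_reduces_balance(self):
            account = Account(balance=100)
            account.withdraw(30)
            self.assertEqual(account.balance, 70)

        def test_overdraft_is_rejected(self):
            account = Account(balance=10)
            with self.assertRaises(ValueError):
                account.withdraw(50)

    if __name__ == "__main__":
        unittest.main()

The point isn't the toy example itself but the short feedback cycle: a defect introduced this morning is caught this morning, not months later during end-of-lifecycle testing.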
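And to put the "nines" in perspective, here's a quick back-of-the-envelope calculation (my own illustration): each additional nine cuts the allowable downtime by roughly a factor of ten, which is a big part of why the verification effort climbs so steeply.

    # Allowed downtime per year for each availability target. Each extra
    # "nine" shrinks the downtime budget by roughly a factor of ten.
    MINUTES_PER_YEAR = 365.25 * 24 * 60

    for label, availability in [("3-nines", 0.999),
                                ("4-nines", 0.9999),
                                ("5-nines", 0.99999)]:
        downtime_minutes = MINUTES_PER_YEAR * (1 - availability)
        print(f"{label} ({availability:.3%} up): "
              f"{downtime_minutes:7.1f} minutes of downtime per year")

That works out to roughly 526 minutes of downtime a year at 3 nines, 53 minutes at 4 nines, and barely 5 minutes at 5 nines -- a budget tight enough that you have to test and inspect for failure modes you could simply shrug off on a less demanding system.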
In conclusion, it does seem to be true in the majority of situations -- which is what the level 1 rhetoric focuses on -- that higher quality leads to lower development costs. But at scale this doesn't always hold true.
PS -- Sorry for the corny title, but a couple of days ago at the Rational Software Conference I had the pleasure of interviewing Jamie Hyneman and Adam Savage from the Discovery Channel's MythBusters show as part of the conference keynote. They're great guys, BTW, who have had a really positive impact on motivating children to be interested in science (apparently kids like to see stuff get blown up, go figure).