Over the last couple of months, we've talked about joint practices and programmer practices. Joint practices pull all of the members of our "one team" together; everybody needs to do them all the time. The programmer practices are the ones programmers must follow; while others may care about them, programmers need to live them daily.
This month we'll explore the customer practices and management practices. As with the programmer practices and joint practices, some of these practices weren't in the original XP list of 12. In the following sections, after each name, I'll parenthetically note whether the practice is new, unchanged, or maps to an original name. Note that these names are in flux, but the principles probably aren't.
The customer team doesn't care about programming per se, but they do care about the inputs and outputs. Customers determine which features should be in each release, what each of those features means, and whether the programmers can say they are "done" coding a feature.
Typical software development projects I've been part of specified requirements in rather formal documents that everyone referred to as "the requirements doc." XP is different for one primary reason: you rarely know all the requirements at the beginning of a project (and they change frequently anyway as the team learns).
Static documents that feel like contracts don't fit into the XP scene very well. XP teams learn as they go. This doesn't mean you shouldn't think before you start to code. It does mean you should be realistic about how far into the future you can think -- not very far at all. So, should we give up planning? No, but we need to plan differently. The release planning practice covers this, but with XP there's a prerequisite that involves a new way to communicate: story telling.
Think back to some of the requirements gathering exercises you've participated in. You might have conducted interviews, held rather lengthy meetings, and so on. You probably asked some clarifying questions and probed for more detail. At the end, you created a document and sent it around to everybody that mattered (and some people who didn't, but had to get the document anyway for political reasons). They were supposed to review it and sign off that you captured all the requirements. Did that go well? The only time it ever did for me was when the system was trivial, if then. Most of the people who had to sign off on the requirements felt nervous about doing it. They often said things like, "I'll sign off for now so we can proceed, but I'm not sure you captured everything." That made me nervous as I imagined the eleventh-hour revelations about critical requirements we missed. It simply wasn't a recipe for success.
The XP approach is radically different. It proposes that a team sit together in a room and talk about what the proposed system needs to do and how it needs to do it. Programmers or customers write these things down on note cards. Each note card says something like:
As a <some user role>, I need the system to <functional requirement> so that I can <business reason for the requirement>.

For example, a hypothetical card might read: "As a warehouse clerk, I need the system to flag orders with missing shipping addresses so that I can correct them before the daily pickup."
The stack of note cards becomes the set of requirements for the system, as they are known right then. The cards don't have exhaustive detail on them. Rather, each card is a promise for a future conversation to flesh out the details. In the meantime, the card simply needs enough meat on it to give the programmers an idea of how much work it will take to build it, and to give the customers enough detail to write an acceptance test (see Acceptance testing).
This form of requirements gathering requires a different skill from the business people who are the source of the requirements. In my experience, people in requirements meetings tend to throw any old thing out there just to go on record as having said it. That way, if something gets missed, it isn't their fault. This is bad for the team and is a significant cause of project failure. What we really need is requirements contributors who are good at telling stories. They tell them at the right level of detail, and they take responsibility for being the one to flesh them out when a pair of developers actually tries to code them.
Some people criticize this XP practice as too informal. I assume they mean that the requirements documents most projects have contain more formal language, each requirement has a reference number, and the requirements gathering process is more focused on arranged meetings with various stakeholders. In my experience, none of those things leads to better -- or more complete -- requirements. In fact, my experience tells me they lead to junk that's outdated as soon as it hits the page.
I have worked on successful projects with such formal requirements documents and requirements gathering methods, but my gut tells me the formal requirements document and process had nothing to do with our success. My teams still had to follow up on each requirement when it came time to code something. We still had to ask the source of the requirements to tell us the story XP says they should tell us to begin with. Why not shorten the process and start with the story? That's what this practice is about. Story telling is a skill, and some people are better at it than others, but everybody who contributes requirements to an XP project has to do it. Fortunately, everyone doing it can improve with practice.
I never much cared for the "game" part of the planning game. It didn't sound professional enough, not that there is anything wrong with having fun at work. Release planning is about making sense out of the stories programmers and customers tell. Release planning happens once per release.
Some people like to criticize XP as glorified hacking -- just a bunch of cowboys cobbling together a system without any discipline. Wrong. XP is one of the few methodologies out there that recognizes that you can't know everything when you start. Both the customers and the developers will learn things as the project progresses. Only methodologies that encourage and embrace such adaptability will be effective. Status quo methodologies ignore change. XP listens. The way it listens is through release planning.
The main idea behind this practice is to make a rough plan quickly and refine it as things become clearer. The artifacts of release planning are:
- A stack of index cards, each containing a customer story, which will drive the project's iterations
- A rough plan for the next release or two, as described in Planning Extreme Programming by Kent Beck and Martin Fowler (see Resources)
The critical factor that lets this style of planning work is to let the customer make business decisions and allow the programmer team to make technical ones. Without that, the whole process falls apart.

The programmer team makes the technical decisions:

- Estimates of how long it will take to develop a story
- Cost implications of using various technology options
- Team organization
- The "risk" of each story
- Order of story development within an iteration (doing risky items first can mitigate risk)

The customer team makes the business decisions:

- Scope (the stories for a release, and the stories for each iteration)
- Release dates
- Priority within each release (which features get developed first, based on business value)
Planning happens often, which provides frequent opportunity for either customers or programmers to adjust the plan as they learn new things.
As I mentioned in the test-first development practice section of last month's installment, acceptance testing is the customer half of the XP testing picture. In fact, I prefer to call these tests "customer tests" because that name focuses on who should be creating them. Customers make sure each development story has customer tests to validate it. Customers can write the tests themselves, recruit other members of their organization (for example, QA people or business analysts) to write them, or combine the two approaches. The tests tell them if the system has the required features and whether the features work correctly.
Ideally, customers will have the customer tests for the stories in an iteration written before that iteration is over. Customer tests should be automated and run frequently to ensure developers are not breaking any existing features as they implement new ones. Typically, customers will need some help from the programmer team to write these tests. On one project, we developed a reusable automated customer test framework in Ruby, which in my opinion is the best possible language for the job (unless I can figure out a way to do it in Smalltalk). This framework lets the customer specify "actions" that the system should perform. The syntax is quite English-like. The interesting part is that the test scripts customers write are executable Ruby code. The Ruby framework reads each script, executes each action, and spits out "PASSED" or "FAILED" for each. The framework handles Web applications and desktop applications written with the IBM Standard Widget Toolkit (SWT). Customers have seen the value quickly. My company, RoleModel Software, is planning on making the framework open source.
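The article doesn't show the framework itself, but to make the idea concrete, here is a minimal sketch of what an English-like, executable customer test might look like in Ruby. Everything in it (the `CustomerTest` class, `action`, `run`) is a hypothetical illustration of the approach, not RoleModel's actual framework:

```ruby
# Hypothetical sketch of an English-like customer test framework in Ruby.
# The class and method names are illustrative assumptions, not the actual
# framework described in the text.
class CustomerTest
  def initialize(name)
    @name = name
    @steps = []   # each step: [plain-English description, block that checks it]
  end

  # Record an "action" the system should perform, with a block that verifies it.
  def action(description, &check)
    @steps << [description, check]
  end

  # Execute every action, reporting PASSED or FAILED for each.
  def run
    @steps.map do |description, check|
      status = begin
        check.call ? "PASSED" : "FAILED"
      rescue StandardError
        "FAILED"
      end
      puts "#{status}: #{description}"
      status
    end
  end
end

# A customer-written script reads almost like English:
cart = []
test = CustomerTest.new("shopping cart")
test.action("adding an item puts it in the cart") do
  cart << :book
  cart.include?(:book)
end
test.action("the cart then holds exactly one item") { cart.size == 1 }
results = test.run
```

Because the script is plain Ruby, customers get readable, English-like test descriptions while the team gets something it can run automatically after every change, which is the point of automating customer tests in the first place.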
Not all of the customer tests have to pass all the time. The customer decides which ones are critical and which ones aren't -- that's a business decision. The point is that these tests help customers gauge how "done" the project is. They also allow customers to make informed decisions about whether something is ready for release.
I have a tendency not to want to let people see stuff I'm working on until it's polished. That's silly. Getting feedback early and often keeps me from wasting effort on something nobody really wants. Releasing frequently gets working software in the hands of real users so they can give you the feedback you need. Releases should be as frequent as possible while still delivering enough business value to make them worthwhile.
Release the system as soon as it makes sense to do so. This provides value to the customer as early as possible (remember that money today is worth more than money tomorrow). Frequent releases also provide concrete feedback to developers on what meets customer needs and what doesn't. The team can then include these lessons in its planning for the next release.
Why is this a customer practice? Because the customer team needs to drive what the programmer team releases. The customer team is the one that knows whether certain system functions are useful for users at any point in time. Also, making the customer team responsible for frequent releases keeps them focused on the reason you developed the software in the first place: to deliver a useful product that people will in fact use. The best way to do this is to let your users tell you what they want. Most often, they can't do that very well if they don't have something to see and touch. Give that to them and they'll give you the feedback you need.
You may have noticed that one of the original XP practices related to customer behavior (although it wasn't categorized that way explicitly) is no longer in the list: on-site customer. Customers are now part of the "one team." It is assumed they will be on site and close by.
In my own experience, I have found it vital to be close to the customer. Ideally, I'd like the customer to sit in my cubicle area, close enough that he can overhear what the team is saying. That would allow us to ask a question immediately whenever it comes up. Unfortunately, this kind of intimacy can't always happen. The next best thing is to have the customer close by, which means the programmer team has to be located with the customer. It is almost impossible to use XP effectively when you aren't close enough to your customer to ask questions immediately, or at least very soon. We've tried to have a remote customer who is available by phone "anytime," but it just doesn't work as well.
When most people hear the word "manager," they probably think of somebody who tells people what to do. For a software project, management creates work plans, recruits developers for the "team," and keeps everybody on task, on schedule, and on budget. On an XP team, management is also part of the team, but in a different way than you (and they) might think.
Good managers rarely have to tell anybody what to do. Instead, a good manager points out problems that the team may not have noticed (because they're in the middle of it), then lets the team figure out ways to solve them. For many managers, this is a radically different way of thinking. As Kent Beck said in his paper on the revised practices:
This [practice] is a big shift for management, as it implies giving up control yet still being responsible for the team working effectively.
Having been a project manager, I know this approach sounds a little scary. But driving people like slaves -- or treating them like children -- sounds worse to me and is an approach that is likely to fail. It is very difficult to force somebody to do things your way. Instead, I would rather raise issues, make people confront them, and then let them figure out how to solve the problems. That way, you let developers and customers accept responsibility without abusing them.
When the Allied forces in World War II were pushing eastward through Europe after D-Day, one of the biggest factors that contributed to their success was splendid air cover to support the ground troops. The bombers and fighters pummeled the enemy and softened up the defenses so that the Allied ground troops had less work to do. While people who aren't on the project team aren't really the enemy, they can easily distract a team from its ultimate goal of planning, coding, releasing, and getting feedback. Management has to handle these interruptions.
There are also a lot of administrative and organizational things that get in the way of a team making software, such as signoffs for access to certain environments and getting broken workstations fixed. Management needs to remove those obstacles.
Finally, the team fits into an organization somewhere, and those other groups need to know what the team is doing and why. Management needs to be responsible for this communication.
Management should take part in the retrospectives I discussed in Part 1 of this series. The quarterly review is different. Its purpose is to get as much of management together as possible to make sure everyone (the team and managers) has all the information they require. This includes any managers who spend a lot of time with the team. It also might be wise to invite other people perhaps farther up the chain who may not spend as much time with the team, such as the manager's boss or the person paying for the project. Having a regularly scheduled meeting makes this kind of communication a regular part of life.
Perhaps management of an XP team needs to discover the lost art of leadership. Management needs to give the team an unvarnished picture of where they are at any point, through metrics, words, or both. If the developers or the customers get stuck, management can suggest ways to break the logjam by pointing out problems, making suggestions, giving advice, or providing encouragement. Only rarely should this practice include telling people what to do.
Without a pacesetter, the team is likely to burn itself out. Management needs to set the pace for the team. If people are working too hard, management needs to get them to slow down, go home, and get some rest. Tired people produce bad software. The number of hours isn't important; the level of fatigue is.
Management also needs to make sure the team has the time it needs to do what it needs to do. For example, many people consider refactoring a waste of time. In fact, keeping your code clean will allow you to keep your speed up; letting it get dirty will slow you down sooner or later. Management needs to make sure the team gets the time it needs for refactoring and other necessary tasks.
XP can feel odd when you start. This is because it requires each member of the team to think differently. Sometimes you'll want to quit. But the change takes time and concentrated effort. It is not something that just happens. Much of the change needs to be conscious. Next month's installment will help you understand how to begin to make the shift in thinking that is required to make XP work.
- Participate in the discussion forum.
- Read all of the articles in the Demystifying Extreme Programming series.
- Read the original "XP distilled" (developerWorks, March 2001).
- Check out "What Is Extreme Programming?" by Ron Jeffries for even more information on what XP is (and isn't).
- Download xUnit testing tools at XProgramming.com.
- Read about the economics of pair programming in "The Costs and Benefits of Pair Programming" (PDF) by Alistair Cockburn and Laurie Williams (XP2000 submission, 2000).
- If you want to learn more about XP, be sure to pick up a copy of the books referenced in this article:
- Extreme Programming Explored by William C. Wake
- Extreme Programming Applied: Playing to Win by Ken Auer and Roy Miller
- Planning Extreme Programming by Kent Beck and Martin Fowler
- Extreme Programming Explained: Embrace Change by Kent Beck
Roy W. Miller has been a software developer and technology consultant for almost ten years, first with Andersen Consulting (now Accenture) and currently with RoleModel Software, Inc. in North Carolina. He has used heavyweight methods and agile ones, including XP. He co-authored a book in the Addison-Wesley XP Series (Extreme Programming Applied: Playing to Win) and is currently writing a book about complexity, emergence, and software development. Contact Roy at firstname.lastname@example.org.