Playing the numbers
I’m fascinated by some recent, popular economics-type books that try to root out the hidden reasons behind certain things that people do. Armed with economic theory, the authors of these books pore through stores of overlooked data, looking for unexpected incentives that explain why people sometimes behave in unexpected ways in certain circumstances. Once those unexpected incentives are revealed, a lot of seemingly odd behavior starts to make a lot more sense.
One example involves the act of giving blood. Data showed that when people were offered a stipend for their donations, less blood was given. On the surface that didn’t seem to make sense -- why would people give less blood when offered money for their trouble? The study revealed that the money made the generous act of donating blood feel more like a bad way to make a few dollars. The hidden motivation: since giving blood involves a bit of pain, people will happily do it as charity, but not as a low-paying job.
Using economic theory to explain how people interact isn’t all that new. Of course, Adam Smith defined the modern science of economics 235 years ago with The Wealth of Nations. He said that "self interest, specialization and trade" were the correct answers to the question, "How should people best work together?" My guess is that for most of the following 234 years, many have tried to find hidden incentives to explain why people often didn’t behave as Smith’s science predicted.
What is new is the availability of all the supporting data. It’s fun when clever people can draw big-picture conclusions from small bits of information collected as part of daily life. As the quality of the underlying information gets better, it follows that the quality of the conclusions that are drawn from it will be better, too.
While I appreciate the search for the unexpected reasons behind what motivates people at large, I’m more interested in a specific group (my team) and a specific part of the economy (my software development project). In this case, knowing how people work is actually part of my job as a technical leader, and it’s certainly the job of my team’s project manager. For any leader, it’s important to find ways to relate to people, or at least to understand when people aren’t working together effectively.
The problem with understanding team members is that today’s extended software development teams have more diverse channels for interacting than in the past, and as a result they are being asked to interact in new ways. People perform multiple roles, and teams are more geographically dispersed. Agile methodologies require close interpersonal contact to get a job done. What’s more, it’s likely that I won’t get to know my teams through direct face-to-face contact; instead, I’ll work with them mostly through conference calls and Web meetings. Just when the demands on collaboration are at a peak, the approach to doing the job has fundamentally changed and gotten a bit more complicated.
I think that understanding my team is an important part of making the new approaches work, and I want to get this understanding without resorting to some unprovable pop-psychology theories on motivation or interaction. Smith and the economists who followed him did this by using statistics and other more scientific approaches to determine cause and effect. Even though I joke that it’s hard to apply any science to what a lot of people do most of the time, myself included, this kind of analysis does provide a workable model for figuring out what folks are up to, and some of what it turns up is funny and unexpected.
The payoff in understanding all of this is getting my little corner of the economy -- my software development team -- to thrive. Since I’m trying to create a tiny, little "economic hot-zone", I want to get the mix right. I can try lots of things to keep the engines of commerce humming. I can invoke the "state charter" of corporate values. I can try the "tariffs" route by reminding everybody what the rules say we’re supposed to be doing. (That usually incites some form of "trade war," where the team reminds me what I’m supposed to be doing.) But you can’t lose with a call to everyone’s individual self-interest, applied with a little specialization and trade.
So to do this right, with as little guessing as I can, I need what the economists used: the numbers, or the metrics. IBM has recognized the need for tracking and metrics with its internal teams, which use agile methodologies. I have worked in agile environments in the past, before supporting tools and metrics had caught up with us. Once they were introduced, I found that good collaboration tools can foster the cross-team interaction that my agile environment requires. Even more, if I don’t have good metrics on how that interaction is going, and some kind of underlying process to go with it, I won’t know where the project is actually going, and I won’t have a crack at finding out whether hidden pressures and motivations are driving my software development off track. Metrics add the science to my theories on what’s going on. It’s the numbers that make it a science, and the science that makes it work.
As Smith himself once said, "Science is the great antidote to the poison of enthusiasm and superstition." (The Wealth of Nations)
Of course, there are other important factors for managing and aligning team goals. Perhaps your personal magnetism draws your team to follow you wherever you lead, and no understanding is required and no tips from this article are needed. But for the rest of us, I’ll focus on the tools that can generate good metrics, and use that to derive good insights. (Don’t worry, I’ve been using technology to cover gaps in my abilities for a long time, so I know what I’m doing.)
First, I’ll explain why I use "economics" to make this point about software development teams. Then I’ll walk through some team roles, show examples of what I mean, and explain how you can align them with your team’s goals. I’ll talk about what I can’t fix, but I’ll show how some good collaboration tools can help track and measure how teams are performing (even in an agile environment) to provide the measures for the insight you want.
Intention versus reality
A lot of people do things for the "right" reasons. You can count on them to do good, important things most of the time. I, for example, recycle my trash to do my part in saving the planet. But sometimes things don’t get recycled. I have to rinse my son’s plastic yogurt containers before tossing them in the recycle bin, or else my garage will soon smell like an old cheese factory. But sometimes, especially if I’m just too tired to rinse them out, I just throw them away. My self-interest in not wanting a smelly garage sometimes outweighs my motivation to save planet Earth.
In the same vein, some of the IT professionals I work with genuinely and tirelessly work to advance the profession, and sincerely help their protégés advance their careers. Working with them is a true pleasure. But I’d be naïve to assume that they don’t have other pressures that mean some things just don’t get done, just like me. And sometimes it’s the good intentions that get reported, rather than what actually happens.
In general, self-reporting of any data is unreliable, and I want to remove the temptation for people to "game" the system by reporting what they want me to hear. The same goes for specialization and trade. I might assume that my team will specialize and "trade" skills and work as a natural part of the process, but I don’t want to assume too much. I hope, for example, that my best "tools" guy is providing useful scripts and widgets to the junior folks, and helping to keep the configurations of our complex development environment working. But it helps if I can actually see that, and perhaps also see where I need to have him ply his trade more effectively. Just as likely, I want to see whether those junior folks are thrashing around, reinventing those tools in a less efficient manner.
While I’m not an economist, I can certainly do what they do and look at the numbers to tell me what’s really going on. Even in agile environments, where team interaction is fast and informal, good metrics can help me understand what might be going on with my team when my best-laid plans aren’t working, and give me a shot at correcting that.
It might be a stretch to call it economics, but the approach is similar: understand self-interest, align it (in my case) with my team’s goals, and make sure I’ve got good specialization and trade of skills and development artifacts. I believe this will work because, while most companies are still striving to become data driven, a study found that the companies that best use their numbers are the most successful (Wall Street Journal, see Resources). I want to be one of the successful ones.
As Smith put it, "It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest." (An Inquiry into the Nature and Causes of the Wealth of Nations)
Good examples of bad economics (from someone who’s done them)
Every role on a development team has different pressures and motivations. I know this personally because I’ve had lots of different roles, depending on the needs of the project at the time. Using myself as a test subject, here are some of the roles I have had, some things I know that can derail me from my good intentions, and metrics that can help prove it:
Table 1. Role: Technical Team Lead
- Good, self-reported goals: Get the project done on time and under budget. Build a track record of success and innovation.
- Hidden, unreported pressure: Dealing with extreme time crunches by shortchanging things I should be working on, and overextending myself by doing things I shouldn’t, such as taking over a task when I should be delegating it.
- Metrics that can show the incentive: Zero input comments on an architecture review for a new element that requires my help, yet a significant number of check-ins and check-outs by me rather than by members of my team, which means the component cost just went up. Thrashing begins on the critical path of the design, since I didn’t fully understand it from the architecture review!
- Solution to align goals: Allow additional time for orientation between projects, and take into account that the up-front cost of doing so will be less than the back-side cost of not doing it.
Table 2. Role: Staff Developer
- Good, self-reported goals: Code efficient modules that meet project requirements.
- Hidden, unreported pressure: A misunderstanding of another developer’s changes, or of the project architecture, has led to an unbuildable module that the two developers modified in common.
- Metrics that can show the incentive: Code merges at check-in lead to intense rework with the other developer. A growing share of defects comes out of just a few components.
- Solution to align goals: Involve me or a similar team mentor to coach these developers on the design of this element, since it’s obviously becoming too costly to maintain, even in development mode.
Table 3. Role: Software Tester
- Good, self-reported goals: Improve the quality of the software by identifying bugs and helping developers resolve them.
- Hidden, unreported pressure: Pressure from the project manager to execute the complete test plan leads to glossing over certain bugs; conversely, an overly complex software solution (see above) causes "severity creep," inflating trivial bugs to monumental severity.
- Metrics that can show the incentive: Number of blocking defects, percentage completion of the test plan, number of blocking defects in non-critical-path components.
- Solution to align goals: Provide the intellectual capital (IC) and training to ensure that the critical path is understood by all members of the team, to keep them from focusing on low-value parts of the application and driving up the cost of parts nobody cares about.
Table 4. Role: Senior Software Developer
- Good, self-reported goals: Pass down institutional knowledge of standards, compliance, and culture. Help to foster team innovation.
- Hidden, unreported pressure: Doesn’t spend any time documenting his work, because he’s spending all of his time in conference calls telling people about the work. Feels that writing it down may compromise the direct need for his skills.
- Metrics that can show the incentive: No intellectual capital (turnover docs, how-tos, and so on).
- Solution to align goals: Make IC creation a part of his job metrics. Track IC creation to show that it reduces overall project cost and duration.
You get the idea. For every role, there’s something that can throw you off the path of economic efficiency. The "economics" of it says that people will respond to the "invisible hand" of the economy without even knowing it. This happens no matter what I ask folks to do or what they think is the right thing to do. I just want to catch those kinds of things before they become a chronic problem and spoil the project.
Not just data: good data (that comes from the process)
Now, there are many barriers to productivity: poor planning, bad tools, unmotivated team members, ineffective leadership, and others. Technology can’t always fix these. But what we can fix is the technology.
What Adam Smith did, and what other economics guys do, is look at what the data shows that comes as a result of the process itself, not some artificial mechanism that might be thrown off by factors he didn’t understand. Like all of us, he may have had expectations about what was going on, but if the data didn’t support it, he changed the theory. That’s the science part of economics: looking at what the results really say.
If you don’t read the metrics, you can’t tell what’s working and what’s not, and you can’t fix anything. One of the key principles of Capability Maturity Model Integration (CMMI) is that to optimize a process, you need metrics to understand where the process choke points are. You then use those metrics to remove barriers and start the cycle again. You can’t improve your work processes without numbers that show where the bottlenecks lie.
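As a purely illustrative sketch of that cycle, imagine exporting how long each work item sat in each workflow state and averaging the dwell times; the state where items wait longest is a candidate choke point. The state names and numbers here are invented for the example, not taken from any particular tool:

```python
from collections import defaultdict

# Hypothetical export: (work_item_id, state, days_spent_in_state)
transitions = [
    (101, "In Development", 3), (101, "In Review", 7), (101, "In Test", 2),
    (102, "In Development", 4), (102, "In Review", 9), (102, "In Test", 1),
    (103, "In Development", 2), (103, "In Review", 8), (103, "In Test", 3),
]

totals = defaultdict(lambda: [0, 0])  # state -> [total_days, item_count]
for _, state, days in transitions:
    totals[state][0] += days
    totals[state][1] += 1

# Average dwell time per state; the largest average marks the likely bottleneck
averages = {state: days / count for state, (days, count) in totals.items()}
bottleneck = max(averages, key=averages.get)
print(bottleneck)  # prints "In Review" -- items wait longest in review
```

Once the review bottleneck is relieved (more reviewers, smaller changes), you re-export and re-measure, which is exactly the optimize-and-repeat loop CMMI describes.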
This is true even though I’m using an agile methodology. Being agile doesn’t mean you don’t track anything; it’s more about avoiding extraneous work that isn’t part of software development. That’s why I want the metrics to be an extension of the processes I follow to get the job done in the first place. Others have written about the metrics that make sense for agile teams. I believe that metrics can support and improve the people aspects that agile requires.
Of course, too much raw data by itself can lead to overload, which actually diminishes your effectiveness (Newsweek, see Resources). The key is to have metrics that cut the clutter and yield good decisions.
Also, to be clear, I’m not talking about collecting personal information here. I don’t want or care about anything that tracks personal data about my team. I’m only interested in the items related to the job, and if they’re like me, they want that kind of information tracked, because they’re trying to eliminate inefficiencies just like I am.
The tools to make it work
OK, so I’ve established that numbers are important, and the right numbers are better. So what technology do you use?
At IBM® we have lots of good collaboration tools. We use IBM Lotus® Sametime® for chat and IBM LotusLive™ for Web meetings, and we’re certainly providing good utilization to our telecommunications providers through lots of teleconferences. Those are important tools and they do have some audit capabilities that I use.
But the primary tool we use to track it all is IBM Rational® Team Concert™, which is a good example of the natural flow of metrics from how we work anyway. Rational Team Concert is based on the Jazz™ platform, which is designed at its core for collaboration. One of the key elements of Jazz is that it’s a tool that’s central to all parts of the collaborative process, and is designed to support those processes without getting in the way. Any metrics generated come from naturally using the product.
Rational Team Concert has both Eclipse and Web interfaces, so it fits in with existing tooling. It’s designed with geographically dispersed teams in mind and works for any process or methodology you choose. It provides process templates to help you get started, and is integrated with your source control management solution. Best of all for this discussion, it can provide meaningful tracking of these interactions without requiring extra work on the part of the teams. Since it’s central to the existing processes, you get these benefits without a lot of additional work.
What do the numbers track?
In my opinion, Rational Team Concert starts its tracking in the right place: with the team. From the first e-mail invitation to join the project, Rational Team Concert tracks who’s on the team, enables others to automatically get invited, and provides links to team standards and IC repository documentation.
The central measure of development for Rational Team Concert is the work item. Work items include defects, features, change requests, and similar tasks that are used as part of the software development process. Rational Team Concert provides tracking, audit trail and approval tracking of these items, along with ad-hoc reporting. There’s help in assigning a work item to a category of work, and images can be included in the descriptions, all to make working the work items easier. You can customize them to better support your process, and you can tag them to make it easier to find them.
There’s a lot of built-in support for agile planning, including help with sprint planning and a task board to support daily scrum meetings. There’s support for risk assessment of completing work items, and graphics to track team load and progress.
Rational Team Concert includes a component-based source control system. It’s extremely helpful to correlate a work item with the code that was actually changed to make it happen, and Rational Team Concert does that. It also integrates the team’s build system to provide awareness and control, which reduces the likelihood of failed builds that your testers can’t use.
Rational Team Concert can integrate with other tools, including requirements tools and other SCM tools. Having a Web interface and support for multiple platforms, including Eclipse, Microsoft® Visual Studio® and IBM System z® gives it a wide audience.
The highlight here for me, though, is the reporting. Right away, I can get 50 customizable team reports that provide metrics from simply using the tool. I can also access work-item reports that track trends in new and closed work items, along with sprint and release burndowns, which are extremely useful for tracking my teams. All of the reporting comes packaged with the same tool I use to collaborate on getting the work done. It’s transparent and automatic; no further collection is required.
Using the numbers to enhance the economy
Using these numbers, I can begin to see and remove barriers for my team. Even better, I don’t need a doctorate in economics to do it.
For example, what if I see too many work items opened up in my non-critical component? That could mean that the team is neglecting the core aspect of the project or is otherwise unfocused on the peripherals. What if I see a lot of defects coming out of one specific component? That could mean that that component is too complex or maybe I’ve got too many developers in there conflicting with each other.
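That "too many defects in one component" signal is easy to compute once defect records can be exported from the work-item tracker. This is a minimal sketch over made-up data; the component names and the half-of-all-defects threshold are my own illustrative choices, not anything prescribed by a tool:

```python
from collections import Counter

# Hypothetical defect records exported from a work-item tracker:
# (defect_id, component) pairs. All names and IDs are invented.
defects = [
    (1, "billing"), (2, "billing"), (3, "ui"), (4, "billing"),
    (5, "reports"), (6, "billing"), (7, "ui"), (8, "billing"),
]

by_component = Counter(component for _, component in defects)

# Flag any component holding more than half of all defects --
# a possible sign of excess complexity, or of developers colliding there.
threshold = len(defects) / 2
hot_spots = [c for c, n in by_component.items() if n > threshold]
print(hot_spots)  # prints ['billing'] -- 5 of 8 defects cluster in one component
```

A hot spot like this doesn’t tell you the cause, but it tells you exactly where to go ask the question.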
What about intellectual capital? What if I don’t see any being tracked? Maybe I need to see what my seniors are doing to backfill, or make sure that common tasks go to the best person -- and that he’s got enough time to figure it out. What about aligning those aspirations?
The point is, I can see what the team is actually working on, compared to what they tell me or what they think they should be working on. Better yet, this tracking can show that the initiatives we care about are actually making the process better from sprint to sprint, and make that improvement part of my team culture. If I can track the non-code items, I can measure those sparks of improvement. I can track velocity, see how teams are working, and think about technology fixes (or maybe non-technology fixes) to coach those teams along.
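The velocity check, in particular, is simple arithmetic once you have per-sprint completion numbers. A minimal sketch, with invented story-point totals and my own rule of thumb (flag a plan that exceeds recent average velocity by more than 20 percent):

```python
# Hypothetical sprint history: story points completed per sprint,
# e.g. tallied from closed work items. The numbers are invented.
completed_points = [18, 21, 17, 23, 24]

# Simple velocity: the mean of the last three sprints, used to
# sanity-check the commitment for the next sprint.
window = completed_points[-3:]
velocity = sum(window) / len(window)

planned_next_sprint = 30
if planned_next_sprint > velocity * 1.2:
    print(f"Overcommitted: planned {planned_next_sprint}, "
          f"recent velocity is about {velocity:.1f}")
```

Here the team averages about 21 points a sprint, so a 30-point plan is a conversation to have before the sprint starts, not a surprise at the end of it.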
You can see that economics takes into account how things really get done. You can’t know what people are thinking, but you can see how they’re acting (and if it’s efficient) by using good collaboration tools with built-in metrics that are tailored to how your team is organized and how your processes work. You don’t want a lot of unreliable self-reporting data, but rather something that gives you the insight as a natural part of your agile collaboration.
I’ve held up Rational Team Concert as the tool that naturally encourages collaboration and generates good metrics routinely. You can then use those metrics to keep things going. It’s true that you can’t fix everything with technology, but if you don’t measure what you’re doing, you can’t fix anything.
Smith said that by using his approach "a workman, even of the lowest and poorest order, if he is frugal and industrious, may enjoy a greater share of the necessaries and conveniences of life than it is possible for any savage to acquire." I’ve lived through many "savage" times, without the direction provided by useful metrics in my process, and it’s no pleasure. I am happy to abandon my savage ways. Smith talks about the "invisible hand" guiding what we do, and around here, I put every hand to work. With the right tools that are a natural part of my team’s processes, I can put the invisible hand of economics to work for me.
Resources
- Rational Team Concert product information
- Rational Team Concert Information Center
- "What Makes a Company Good at IT," by Andrew McAfee and Eric Brynjolfsson, The Wall Street Journal, R3, April 25, 2011
- "I Can’t Think," Sharon Begley, Newsweek, March 7, 2011
- "Freakonomics," Steven Levitt and Stephen Dubner, William Morrow, 2005, p 20 (citing "The Gift of Blood," Transaction 8 (1971), Richard M. Titmuss)
- IBM developerWorks WebSphere