The art and science of IT enablement

from The Rational Edge: Read how software consultants can apply an objective, measurable method to the process of improving the organizational culture of a business. By analyzing several factors, including motivation, fiduciary visibility, and strategic alignment, the author shows how positive change can be effected. This content is part of The Rational Edge.


George Spencer, Principal Rational Consultant, IBM, Software Group

George Spencer is a principal consultant for the IBM Rational services organization. For thirty years he has created and practically applied improved software techniques and software management practices in a wide variety of markets. At IBM, he focuses on building on an intimate understanding of the client's offering and the people who provide it through hands-on involvement with training and pilot efforts. Prior to joining IBM, he worked on software innovations for industry and the military. He created and advocated the winning technology behind President Clinton's Technology Reinvestment Program, and he has also offered unique math-based solutions in the financial securities market, resulting in sponsorship of a new business initiative by Fidelity Capital Markets and Voyageur-Capital.

15 August 2007

Also available in Chinese and Russian

"The objective of science is to measure what is measurable, and make measurable what is not yet so." - Galileo Galilei

As mentors, we often discuss the elements of change. Some would argue that the true motivation for sustaining process improvements is through the science of objective business measurement. But often, gaining visibility into a client's historic cost and schedule overruns, productivity, and product efficacies is difficult; that knowledge is just a little too intimate for new clients to share. Besides, revealing that information in and of itself does not inspire the client to change.

Some argue that mentoring a client toward adopting process improvements is an art, just as providing any guidance inspiring a change to human behavior demands artistry of sorts. For example, how does one inspire?

A well-thought-out approach should both inspire and sustain. The following are my thoughts on how best to mix this art and science of mentoring, so that the process truly becomes that of enablement.

The starting point of all achievement

"The starting point of all achievement is desire. Keep this constantly in mind. Weak desires bring weak results, just as a small amount of fire makes a small amount of heat." - Napoleon Hill1

As Hill suggests in that statement, a client must have a strong desire to change if a great achievement is to become the downstream result. But you can't count on that desire being present, not at first.

If you're going to improve a customer's business through process improvements, begin by listening. Listening is by far the most powerful tool the process mentor2 will have in the early days of an extended project.

You can't suggest -- not immediately, at least -- that your clients need to profoundly change the way they work. They may not even realize how much time and money they're wasting, and you cannot be sure that you have all the answers, anyway. So adopt a more humble approach. Listen. Ask questions.

Below I offer some questions that will allow you to assess the "cultural landscape" you have just entered. But first, let's take a look at a general map you might use to locate the client's business as you begin your engagement.

Figure 1 shows "The Success Landscape." It represents the two dimensions of motivation and visibility. Motivation is judged based upon the degree to which one wishes to improve one's business. Visibility pertains to availability of and access to the information needed to credibly motivate others. By ascertaining placement on this notional landscape, those involved with improving process will better understand what needs to happen to avoid the common pitfalls.

Illustration showing two axes: fiduciary visibility and strategic versus tactical

Figure 1: The Success Landscape

The vertical axis ranges from strategic to tactical. A strategic involvement is one focused on the direct needs of business. Profitability, costs of production, and cost of maintenance are all examples of strategic business objectives. These represent areas where the highest aspirations of change might be located. Business leaders usually consider success in these areas the most exciting and farthest reaching objectives of a corporation.

By contrast, tactical involvement is focused on the mechanics of how business is conducted. Many limit their aspirations to these less inspired goals because of perceived risk, a lack of awareness of true business needs, a wish to avoid confrontation with advocates of the status quo, or simply the perceived safety of maintaining "one's station." Certainly a tactical understanding of how to do things is essential, but ultimately only in the manner that it affects the strategic objectives.

Locating the aspirations of the company's point of contact, workforce, and management places them vertically on this landscape.
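The two-dimensional landscape can be thought of as a simple data model. The sketch below is purely my illustration of that idea; the names, numeric scales, and 0.5 thresholds are my own assumptions, not part of the article's method:

```python
# Hypothetical sketch: placing an organization's players on the
# Success Landscape (Figure 1). Scales and thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class Placement:
    name: str          # e.g. "POC", "Management", "Workforce"
    motivation: float  # vertical axis: 0.0 = tactical .. 1.0 = strategic
    visibility: float  # horizontal axis: 0.0 = low .. 1.0 = high fiduciary visibility

def describe(p: Placement) -> str:
    """Summarize where a player sits on the landscape."""
    vert = "strategic" if p.motivation >= 0.5 else "tactical"
    horiz = "high" if p.visibility >= 0.5 else "low"
    return f"{p.name}: {vert} motivation, {horiz} fiduciary visibility"

# A distribution like the "typical alignment" discussed later in the article:
players = [
    Placement("POC", 0.8, 0.2),         # inspired, but denied the numbers
    Placement("Management", 0.3, 0.6),  # tactical focus, holds the numbers
    Placement("Workforce", 0.1, 0.1),   # job-at-hand focus, no visibility
]
for p in players:
    print(describe(p))
```

The engagement's goal, described later, is to move every placement toward the strategic, high-visibility corner of this model.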

So how do you make an assessment, in order to place a client on this landscape? The seasoned process mentor may ask questions such as:

  1. What types of changes are you envisioning?
  2. What is your motivation to change?
  3. How broadly is this motivation embraced by management?
  4. How broadly is this motivation embraced by contributors?
  5. How does this affect your company's objectives as stated in the last annual report?

As stated before, the mentor must be listening. The only wrong answers are the ones that do not reflect the client's true feelings. So the wise mentor will avoid leading the customer, until later. Furthermore, these questions are better delivered in person or over the phone. The preparation of a written response may cloud the truth with formality. It is important to know where their hearts are.

The response to these questions can be plotted3 on the vertical axis, as shown in Figure 2.

Illustration shows answers to questions helping determine how strategically motivated an organization is.

Figure 2: Judging motivations: Strategic to tactical orientation

These example answers shown in Figure 2 range from a tactical desire, "We want to build good code," to a strategic desire, "We want to increase profitability." The closer the alignment of the client's desires to their company's business desires, the more strategic their thinking.

Along the way, even the most sensitive mentor assessing corporate culture will encounter questions from various sectors of an organization. "What are you doing here?", "Why are we changing?", and "We really don't have time to meet" are not unusual responses. Every mentor knows the hard realities of an enablement unsupported by local management. This assessment anticipates such issues by examining the overlaps and gaps among the motivations of your client's primary point of contact4, management, and the workforce. Figure 3 shows a typical alignment of motivations, as an example of such a distribution.

Illustration locates the strategic to tactical alignment aligned with management layers

Figure 3: Typical alignment of motivations

The example distribution of motivations in Figure 3 might be described as follows:

  • The client's point of contact, or POC (C), is highly inspired, thinking about fulfilling some fundamental business need, but is unaware of the corporate objectives. Often these POCs are motivated to have their organization achieve some broad level of process competency, such as CMM.5 But not knowing the overall corporate intention -- the "why" -- makes it difficult to prioritize early adoption efforts.
  • Management (M) desires to document the "New Process" which is a listing of improved tactics from some process library or expert. The attitude "We just need to know what to do for this new process" is much less inspired than being a vital part of fulfilling company growth or profitability.
  • And the workforce (W) is motivated only if it involves their current responsibilities. This is often in keeping with the reward system in place.

Again, the process mentor is not effecting change at this stage, only listening; the intent is to figure out where the players are starting.

Making measurable what is not yet so...

Does the team share the visibility via measured results that they can point to and thus incontrovertibly demonstrate the success of a new process? Much in the way that Galileo's science swept away mysticism and superstition, measured improvement can sweep away unproven opinion during the course of process adoption.

In every company there are individuals (typically officers) who are entrusted with the financial performance of the corporation. Their performance is typically measured by their contribution to growth, profitability, and outlook. These involve managing the cost of development, cost of operations, productivity, and product efficacy in the market.

The next series of questions address visibility into these fiduciary concerns. They start with the simple questions about what the players know, and end up with the more difficult questions concerning what the client is willing to share:

  1. Do you track time for individuals?
  2. Is this tracking on a per project basis?
  3. Do you keep records of the number of errors requiring correction?
  4. Do you keep records on customer complaints and questions?
  5. Do you keep track of actual versus planned expenditures and budgets?
  6. Do you measure productivity?
  7. What is your average cost/schedule overrun?
  8. What is your productivity?

Illustration shows range of fiduciary visibility from low to high.

Figure 4: Visibility into accomplishments

The answers to these questions place a client on the horizontal axis which describes visibility into the fiduciary performance, as shown in Figure 4.
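One crude way to turn these eight questions into a position on the horizontal axis is to score the yes/no (or known/unknown) answers. This is my own illustrative sketch, with an equal weighting I have assumed; the article itself prescribes no formula:

```python
# Hypothetical sketch: scoring the eight fiduciary-visibility questions.
# Equal weighting is an assumption made for illustration only.

VISIBILITY_QUESTIONS = [
    "Track time for individuals?",
    "Time tracked on a per-project basis?",
    "Records of errors requiring correction?",
    "Records of customer complaints and questions?",
    "Actual versus planned expenditures tracked?",
    "Productivity measured?",
    "Average cost/schedule overrun known?",
    "Productivity figure known?",
]

def visibility_score(answers):
    """answers: one boolean per question (True = yes / figure is known).
    Returns a 0.0..1.0 position on the fiduciary-visibility axis."""
    if len(answers) != len(VISIBILITY_QUESTIONS):
        raise ValueError("expected one answer per question")
    return sum(answers) / len(answers)

# A client that tracks time and budgets, but measures little else:
score = visibility_score([True, True, False, False, True, False, False, False])
print(score)  # 0.375 -- toward the low-visibility end of the axis
```

A client near 0.0 has little factual basis for demonstrating improvement; one near 1.0 can make a credible, measured case.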

Fiduciary performance will establish a credible basis for demonstrating improvement. And so, the extent to which those metrics exist in the first place -- and the degree to which visibility is extended to the point of contact and ultimately to you, the process mentor -- determines how credibly improvement can be demonstrated and sustained. The alignment of the corporate culture regarding fiduciary visibility can be graphed along a continuum, as shown in Figure 5.

Illustration shows a range of fiduciary visibility

Figure 5: Typical alignment of visibility

The typical alignment of visibility, shown in Figure 5, may be explained as follows:

  • The client's management (M) is measuring some metrics such as expenditures on projects, but is unsure as to how those costs break down within those projects.
  • Perhaps because this information is being kept "close to management's vest," the client's point of contact (C) is aware only of the number of bug reports and enhancement requests from clients.
  • Perhaps the client's point of contact, taking a lead from his management's style, is similarly unwilling (U) to share this information with the process mentor.

Let's pause a moment to contrast two clients. The first hired an outside consulting firm, at some expense, to assess the efficiency of their development process. The report was delivered; but somehow, after two months, the final report "cannot be found." The second client had just started to use time cards. And although they don't have a history to review, they are most willing to share whatever numbers they have. Approaching these engagements was quite different, and I bet you can guess which one was more successful.

The outlook for success

Why are these two sets of questions so important? John Kotter reports that only 15% of corporate transformations are successful.6 How can companies expect to achieve the commonly touted understanding of "bottom lines" such as profitability, productivity, and growth without an inspired effort and broadly visible metrics?

Considering our landscape, let's look at the typical reasons for failure to adopt improved strategies, as illustrated in Figure 6.

A scatter plot diagram shows a variety of risks to improvement

Figure 6: What is the outlook for success?

On the bottom of the graph in Figure 6, a lack of a strategic desire undermines short-term buy-in to the improved approaches. Attitudes expressed by "Why use use cases?" or "What's so great about describing design-agnostic scenarios?" reveal that the organization is not steeped in meaningful strategic business objectives; thus there's a lack of urgency in the minds of those expected to adopt these goals.

Similarly, on the left, a lack of visibility into the real metrics that motivates the corporation means that the change effort will never be able to factually prove success to the policy makers. Hence the ability to sustain improvement is similarly undermined.

A typical example of an engagement is shown in Figure 7. This real-life example involves the following players:

  • The process mentor (U), who knows the strategic potential of process improvement, but is denied visibility into the metrics that can tangibly demonstrate these improvements. The inability to demonstrate improvements -- whether or not they occur -- will undermine client motivation across the board and deny feedback for meaningfully evolving the process.
  • The principal point of contact at the client (C), who also understands some of the strategic potential, but is similarly denied visibility.
  • The workforce (W), which has little interest in business objectives, and certainly no visibility into the specifics that measure their success. Focus here is largely tactical, as the concern is limited to how to address the job at hand.
  • And finally, the client's management team (M), which has reasonable visibility into the company's fiduciary concerns, but has not embraced the strategic potential of process. Perhaps they are more concerned about simply defining a process rather than connecting aspects of this process to these financial goals. And of course, without this awareness, there is no way for management to communicate urgency to the workforce.

Graph shows points of interest plotted in three quadrants

Figure 7: Moving toward success

So the objective of the mentoring engagement is to address these realities of the current organization by putting in place an approach that will, with time, converge all participants in the strategically minded, high visibility, highly motivated quadrant.

With such an achievement, participation will become inspired and the subject of ongoing refinement into the future.

The first decision: Tactical or strategic engagement?

Having considered the forms of improvement and the degree to which an organization may need to improve, the first step for the mentor is to decide whether this is a tactical or strategic engagement.

The distinction is important. A tactical engagement is driven by a desire to introduce a reasonable process on a limited basis, often gauging the results subjectively. Business objectives may be generally understood, but are not directly tied to shaping the practices being explored. In some cases, the goal of a tactical engagement is to inspire a sustainable strategic engagement.

A strategic engagement has defined business objectives that are well understood by those responsible for corporate financial performance, such as the board of directors and the company's officers. Practices employed are specifically aligned to these goals, and metrics are gathered to objectively gauge their effectiveness. The goal for a strategic engagement is to institutionalize process in a manner that will continue to fulfill business objectives as the supporting technologies and needs continue to evolve.

So how does a mentor decide?

Sometimes the answer is simple. If the principal point of contact at the client is effecting change within a single department, has no hope of gaining visibility into performance, and is neither tasked nor inspired to effect improvements pervasively, then it's tactical.

Sometimes the answer is not as clear. In judging their ability to adopt a strategic engagement, the mentor may examine the client's ability to fulfill the obligations stipulated in a readiness plan.7 The big question for the client is, "Are you ready to strategically institute process improvement?" The following list of topics and related questions helps determine strategic process readiness:

  1. Willingness to start: Is the company willing to stand up a software development organization accountable to the board of directors for guiding, gauging, funding, and evolving the company's development processes?
  2. Empowerment through access to data: Is this new software development organization empowered and willing to state definitive business objectives that mirror publicly stated corporate concerns pertaining to the fiduciary goals of the organization?
  3. Credible approach: Is the software development organization at the point where it can credibly explain through fact rather than authority, how specific process improvements can now, and will continue, to evolve to fulfill these business objectives?
  4. Authority to control process: Does this software development organization have sufficient backing to subsume, align, or differentiate other process initiatives within the company?
  5. Strong process leadership: Is the software development organization, at the beginning of this effort, willing to define and publish the business objectives for process adoption to the company officers? Is the software development organization willing to stand up for the merits of these changes to the workforce?
  6. A shared, singular vision: Through review and amendment, is the collection of principal process stakeholders capable of converging on a common set of business objectives to drive process?
  7. Authority to influence change: Has the software development organization been granted the appropriate authority and support to influence the manner in which work is conducted through both providing the appropriate incentives, and the outward advocacy of front-line and delivery managers in the work force?

Getting to seven solid "Yes's" requires credibility. Companies that have not insightfully examined their processes, and take only a summarized, bottom-line view, typically do not understand why their process is ill. (See Figure 1.) Credibility for approaching process improvement is something that must be earned, and without access to control the workings and measurement of the discrete elements of process, gaining credibility can be quite a hurdle.

For example, an insurance company was recently experiencing unacceptable cost and schedule overruns. They also had no means to measure rework and could not differentiate between requirements churn and implementation issues. Given the lack of visibility, the resulting prescription was limited to: "Work harder!" and "Once signed off, it will take an act of God to change the specification."

The issue was that the business specifications were changing because they were not adequately completed and reviewed. This, in turn, was due to a tradition of including GUI design and other encumbering design issues in those specifications. The true nature of this situation was never realized because of a lack of visibility and granularity into measured performance. Improvements could be made by, for example, separating behavioral specifications from design specifications, and measuring the degree of change of each separately; in this way, they might realize that perhaps poor performance resulted from either of these possibilities:

  • The business analysts were not settling on a common understanding of how the system was to behave.
  • Behavior was well understood, but the design teams were not settling on a comprehensive approach -- in other words, the organization lacked a well-thought-out architecture.

Effecting sustained change requires an institution inspired by visible company goals, a shared vision, the authority to institute, process experience, a credible approach, visibility of measurement, and strong ongoing leadership. The seven "Yes's" speak to why process improvements are difficult, and often require an outside catalyst -- one with built-in credibility and process experience for bootstrapping positive change.

As many companies are not willing to start by jumping into full-scale process improvement, the first goal for the mentor is to begin to move the organization toward these seven "Yes's." And it begins with the tactical engagement.

The tactical engagement

The purpose of the tactical engagement is to begin to explore the efficacy of improved process within the organization. Although internally the business objectives may be broadly known, the general experience of the process mentor and hands-on exposure to ongoing projects will govern which practices are followed.

The intention is to begin to expose the merits of process improvement: either gaining visibility or beginning to construct awareness into the fiduciary measures of performance. At the same time, the mentor is making the workforce and management more aware of the exciting possibility of reaching more strategic goals, as shown in Figure 8.

Figure illustrates organizational change toward improved visibility

Figure 8: Tactical Engagement: Inspiring improvement

Figure 8 portrays a typical example of this desired shift in thinking. Communication between management and the workforce is required to bring the team on-board, or in this graphic, converge the dots. Fiduciary visibility, the topic of this conversation, provides some initial insight into the measured symptoms of having a poor process.

So how is this done practically?

The mentor needs the support of management, which must directly express these desires to the workforce. Once the mentor's efforts have been appropriately introduced and understood, the mentor provides very direct guidance on how work is to proceed.

The mentor attends meetings and discussions, drives and explains new work processes and artifacts, resolves issues, suggests new management practices, etc. In general, the mentor influences what he or she can in light of personal experience, passion, and the constraints of the workplace.

And when it comes to judging results, the mentor is always on the lookout for numeric and anecdotal evidence of improvement. Avoiding the "art of office interpretation" is best achieved with direct hands-on involvement, and non-nuanced status reports that grab people's attention. Figure 9 offers one such example from a recent client.

Figure shows tabular data regarding progress

Figure 9: Typical tactical scorecard

Figure 9, along the left, under "Success Criteria," introduces several somewhat broad goals. These are tactical goals, but what is important is that they represent needs that are commonly perceived among the team. For this recent client, they were:

  1. An improved and documented design process that:
    • Results in more complete business specifications
    • Results in improved designs
    • Is consistently employed
  2. Adherence to an improved quality assurance process
  3. Reduced development costs

In the next column labeled "Key assessment item," the process mentor has identified specific tactical accomplishments:

  1. Develop a clear statement of Vision
  2. Business needs based on system Usage (Specification is based on Use Cases)
  3. Stakeholder concurrence based on review of Vision and Usage (Vision statement and Use Cases)
  4. Directing Project based on Business Needs (Use Cases used as milestones in project plan)
  5. Using business needs to drive architecture (Use Case Scenarios form basis of Architecture)
  6. Defining architecture using cross company standards (UML -- Sequence and Class Diagrams)
  7. Driving design based on Architecture (UML -- Class diagrams)
  8. Driving GUI based on structured analysis of usage (Activity charts)
  9. Assessment of results based on business accomplishments (System tested against Use Case Scenarios)
  10. (Re)shaping project direction based on lessons learned from actual construction and testing of executable code
  11. Establishment of meaningful business metrics to gauge performance

Notice how item 11 begins to lead the team into strategic thinking.

Periodically the team is reviewed. Each column on the right shows another such assessment and demonstrates progress over time. During a tactical engagement, this is typically a subjective review by the mentor, resulting in three possible outcomes for each assessment item:

  • Green: Cost-effective process is being followed
  • Yellow: Uncertain whether cost-effective strategy is in use
  • Red: Cost-effective strategy not being followed
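The scorecard amounts to a grid of assessment items rated Green, Yellow, or Red at each periodic review. A minimal sketch of that bookkeeping follows; the class, method names, and example items are my own illustration, not an artifact from the engagement:

```python
# Hypothetical sketch of the tactical scorecard in Figure 9: each
# assessment item gets a Green/Yellow/Red rating at every review.

from enum import Enum

class Status(Enum):
    GREEN = "cost-effective process is being followed"
    YELLOW = "uncertain whether cost-effective strategy is in use"
    RED = "cost-effective strategy not being followed"

class Scorecard:
    def __init__(self, items):
        self.items = items
        self.reviews = []  # one {item: Status} dict per periodic review

    def assess(self, ratings):
        """Record one review; every item must be rated."""
        missing = set(self.items) - set(ratings)
        if missing:
            raise ValueError(f"unrated items: {missing}")
        self.reviews.append(ratings)

    def trend(self, item):
        """One item's history across reviews, e.g. ['RED', 'YELLOW']."""
        return [review[item].name for review in self.reviews]

card = Scorecard(["Clear statement of Vision", "Stakeholder concurrence"])
card.assess({"Clear statement of Vision": Status.GREEN,
             "Stakeholder concurrence": Status.RED})
card.assess({"Clear statement of Vision": Status.GREEN,
             "Stakeholder concurrence": Status.YELLOW})
print(card.trend("Stakeholder concurrence"))  # ['RED', 'YELLOW']
```

Reading one item's trend across the columns is exactly how a delayed accomplishment, such as stakeholder concurrence, becomes visible over time.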

As these artifacts are from an actual engagement, they reveal much about this particular team. For example, the scorecard in Figure 9 shows that the team quickly understood and faithfully completed most of the activities the mentor outlined. However, as the engagement proceeded, notice how getting "Stakeholder concurrence based on review of Vision and Usage" was delayed. One may suspect -- as the stakeholders were more accustomed to judging the final outcome, rather than participating in the work in progress -- that the team was apprehensive about exposing a work in progress.

Although this iterative exposure to the stakeholder community was difficult, it was most important for this project, as it is for any project. But in a tactical engagement such as this, the greatest contribution of this exposure is gaining acceptance of the overall process from this executive community.

Also notice that change happened from the vision down to development. It is easier to fight a downhill battle than attempting to influence change from the bottom up. For example, how can one encourage the development of use cases when a common vision has never been written down nor embraced? We were fortunate on this engagement to have management support and insight in picking projects still in their formative stage.

Notice also that the one issue that was not resolved was item 11, "Establishment of meaningful business metrics to gauge performance." During this engagement, this growing awareness led the client to think of grander strategic objectives. Such a message needs to be woven persistently and pervasively into all these tactical activities.

With each assessment a short summary with anecdotal information is shared. And each is the newest installment in a "Book of Success," as illustrated in Figure 10.

Figure shows mentor's notes

Figure 10: Mentor's notes with tactical scorecard

In the actual example in Figure 10, the team is being informed of situations and events considered significant by the mentor. Notice that Item 1 tells the story of how the new "Design Agnostic" focus curtailed what would have become a significant change to the scope of the project. Although the change was internally advertised as a simple technical change, in reality it required a new set of behavior and a change to the overall project vision. The business manager commented that such a change "would have taken months!" And of greatest significance was that this change was effected by the new heroes in the company, following the new approach. The mentor said nothing, and took much pride in his students.

This is one of many such examples where the mentor must encourage, discover, and advertise. It was made possible by one of a collection of practices against which the project team was assessed, a collection that may at times require some evolution.

Specific practices may not fare well with certain clients in certain industries. There may be internal or external constraints, real or imagined. The mentor must be willing to evolve how practices are tactically carried out, as illustrated in Figure 11.

Figure shows relationship of performance, practices, and capabilities

Figure 11: Tactical evolution

Figure 11 shows how this process involves identifying practices, as itemized above, providing the infrastructure (training, tools, and talent) to enact these practices, and judging their effectiveness in some simple manner (which can be rated according to the example scorecard shown in Figure 9).

Here, the "Institutional Practices" are less formally established as they map to business needs in a less formal manner, unlike those for a Strategic Engagement shown in Figure 15. Similarly, judging compliance typically begins as subjective, given the lack of visibility into, or perhaps even the nonexistence of, sufficiently refined fiduciary data.

During a successful tactical engagement, well-delivered guidance will be followed, teams will progressively adopt the new practices, progress will begin to be observed and perhaps measured, and examples of success will be realized. The typical process-related fear of pedantic adherence to fixed and encumbering rules will be replaced by the sense that process is something that is shaped by experience and real success.

But most importantly, an awareness of the true potential of process improvement will include an awareness of its ability to approach greater business goals. And after a while, the client may turn to the mentor, as they did in the above example and ask: "What's next?"

The strategic engagement

The purpose of the strategic engagement is to institutionalize process change. And I use the term "institutionalize" for a specific reason: A new or renewed process institution is required.

Too often company management thinks of process as something that is written down once and then simply followed. As mentors, we have seen a good many software process and policy books gathering dust on the shelf.

The workforce knows that pedantic adherence to fixed rules rapidly falls by the wayside in the light of tight schedules and limited resources. And considering the constant change of business needs and technology, what is today's improved process may well become tomorrow's impeding process.

To "Institutionalize process change" means putting in place the appropriate organization(s) with the appropriate authority and responsibility to effectively enact, sustain, and evolve process over time. It means that the individuals in this organization:

  • Are directly involved with process definition and regular work contributions to the projects they serve
  • Are accountable to the process group for ensuring the inspired advocacy of process, while evolving the process based on both measured and subjective project feedback

And although some may nod in agreement to this notion, the organization's motivation to grant this "authority" and "responsibility" may well fall short of what is actually required.

And so, unlike tactical engagement, there is an "entrance exam" for a strategic engagement. Called the "readiness plan," it establishes the intentions of all involved parties. A well-crafted readiness plan answers the questions itemized earlier.

Figure shows team evolving toward high strategy and visibility

Figure 12: Strategic Engagement: Institutionalizing improvement

As shown in Figure 12, management, the workforce, and the principal point of contact within the client must be inspired to effect strategic company objectives, and all must share in the visibility to measure their accomplishments.

The practical commitment to institutionalize is embodied in the publishing and successful advocacy of a readiness plan that:

  • Identifies the key process stakeholders in the organization. At a minimum it includes the chief financial officer, chief operations officer, chief information officer, and the new process group leader.
  • Creates and defines the role, responsibilities, and authority of this new process group.
  • Establishes the roles and responsibilities of management in communicating their support for this new mission to the workforce.
  • Establishes the roles and responsibilities for the outside mentoring organization, and the expected transition or process ownership to the new, in-house process group.
  • Identifies the business objectives and institutional practices that support them.
  • Sets forth a process whereby these objectives and practices shall be evolved.

To summarize, process institutionalization requires creating a new institution. This comes as no surprise in any traditional field of engineering. For example, at the highest level of management, manufacturing concerns address the fiduciary impact of adjusting the manufacturing process. The boardrooms of insurance companies discuss the fiduciary impact of adjusting the claims process. Similarly, executive officers whose corporations have a major stake in information technology are beginning to address the fiduciary impact of the particulars of the software development process.

And the result of such high-level concern is to establish business objectives for the process being managed.

So what are these business objectives? Recall the final question in Figure 2: "How does this affect your company's objectives as stated in the last annual report?" Good objectives should be well aligned to corporate objectives, and quantify some measurable achievement. Typically there are two or three such objectives and they are simply stated. Examples are detailed in Figure 13.8

Figure shows a list of business objectives

Figure 13: Typical business objectives

These business objectives constitute the principal vision for process improvement. And next in this order are the institutional practices that enact them.

Institutional practices are a set of behaviors that describe the process of development, independent of the available infrastructure (techniques, tools, and talent). Like the business objectives, they represent a desired goal independent of how that goal is achieved.

It is recommended that such practices be partitioned "one to many" for each business objective. If a practice addresses more than one objective, it is restated. In each case, the practice describes how it fulfills that unique objective.
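As a sketch, this one-to-many partitioning is just a mapping from each business objective to the practices restated in its terms. The objective and practice wordings below are illustrative paraphrases, not the client's actual artifact:

```python
# Hypothetical sketch: practices partitioned one-to-many under business objectives.
# A practice that addresses more than one objective is restated under each,
# phrased for how it fulfills that unique objective.
practices_by_objective = {
    "Control project budgets": [
        "Budget based on measured artifacts, so estimates track real output",
        "Track budget by accountability, so overruns surface at their source",
    ],
    "Improve product quality": [
        # The same underlying behavior as above, restated for this objective:
        "Budget based on measured artifacts, so quality work is funded explicitly",
    ],
}

# Every practice appears once under each objective it addresses, so the
# total count sums the restatements, not the distinct behaviors.
total_practices = sum(len(p) for p in practices_by_objective.values())
print(total_practices)  # 3
```

Counting restatements rather than distinct behaviors is what yields totals like the forty-three practices mentioned below.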

Figure 14 shows an example based on the first objective itemized in Figure 13. For this particular client, there were roughly ten practices for each business objective, resulting in about forty-three such practices in total.

Figure shows institutional practices listed for first business objective

Figure 14: Typical institutional practices

Dovetailing with the tactical engagement, these institutional practices now map directly to the greater and more compelling aspects of broader business needs. Through this mapping, and through stakeholder approval, the practices gain credibility and importance.

It is important also to realize who is establishing what.

The business objectives are, of course, established by the corporation seeking improvement. Some guidance from the mentor may be required. For example, measured improvement of something that is not currently measured first requires "making measurable what is not yet so."9 Hence these business objectives start with: "Begin to measure and experience..."

The institutional practices are driven largely by the process mentor. Too often clients will try to include idealized past practices that proved impractical because they think: "Well, we just didn't enforce it hard enough." To start what will become an ongoing evolution, the mentor brings to bear practices that have worked well in industry, and that reflect some of the experiences from the preceding tactical engagement, as well as some experiences that are perhaps being implemented for the first time with the current client.

Now one can begin to appreciate the degree to which these practices are expected to improve these objectives. Notice how, in Figure 14, each practice is very specific on how it addresses the objective. The goal here deals with budgeting. And each and every practice describes how it supports budgeting.

This relationship is quantified in Figure 15. This scorecard, like the one in the tactical engagement, carries a vivid message, except this time it is tied more quantitatively to the business objectives. One must be clear: the scorecard at this stage represents expectations as opposed to measured results. It is the rationale for moving forward.

Figure shows a printed score card

Figure 15: Strategic scorecard: Shaping the future

Looking at the scorecard presented in Figure 15, you can see that:

  • Section A is a listing of the business objectives. In this case there are four of them. (See also Figure 13.)
  • Section B itemizes the Institutional Practices that implement these objectives. In this case there are forty-three.
  • Section C is a progressive build of infrastructure elements in place at various points in time. Infrastructure includes the tools, techniques, and training being employed across the workforce.
  • Section D is a computed output representing the expected degree to which the business objectives will be achieved at various points in time.
  • Section E is an input representing the degree to which the specific Institutional Practice in B will be fulfilled by the Infrastructure in C at various points in time.

Walking through this, you can see in Figure 16 that a relative weighting is applied between sections B and E for each practice, identifying the relative degree to which a practice is expected to fulfill a specific business objective.

Figure shows numerical weighting applied to each area of business objectives

Figure 16: Relative importance of institutional practices

So for example, toward the fulfillment of business objective #1, that of controlling budgets, "Budget based on measured artifacts" is considered of weight '5', which is more important than "Tracking budget by accountability," weight '3'. Here too the judgment of the experienced process mentor presides, at least on this initial rollout of process.10

Looking back at Section E in Figure 15, you can see an area of input representing the degree to which the process infrastructure at various points in time (in section C) helps toward fulfilling specific Institutional Practice (in section B).

So, for example, the tools, techniques, and talent that enable "Cross referencing needs and dependencies in commonly reviewed documents", also known as tracing, are considered of "High" relative support toward "Budgeting based on measured artifacts...", but of "Low" relative support toward "Realistically evolve project budgets...".
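The scorecard arithmetic implied by Sections B, D, and E can be sketched as a weighted average. The numeric values assigned to "Low," "Med," and "High," and the practice names and weights, are assumptions for illustration; the weights of 5 and 3 echo the budgeting example above:

```python
# Hypothetical sketch of the scorecard computation.
# Section B supplies per-objective practice weights (see Figure 16);
# Section E supplies the degree each practice is fulfilled by the
# infrastructure in place; Section D is the computed expectation.

SUPPORT = {"Low": 0.25, "Med": 0.5, "High": 1.0}  # assumed numeric mapping

def expected_achievement(weights, fulfillment):
    """Section D: expected degree to which one objective will be achieved.

    weights: {practice: relative weight toward this objective, e.g. 1..5}
    fulfillment: {practice: "Low" | "Med" | "High"} for the current infrastructure
    Returns a value from 0.0 (nothing expected) to 1.0 (fully expected).
    """
    total = sum(weights.values())
    score = sum(w * SUPPORT[fulfillment[p]] for p, w in weights.items())
    return score / total

# Objective #1, controlling budgets, with two practices:
weights = {"Budget based on measured artifacts": 5,
           "Tracking budget by accountability": 3}
fulfillment = {"Budget based on measured artifacts": "High",
               "Tracking budget by accountability": "Low"}
print(expected_achievement(weights, fulfillment))  # (5*1.0 + 3*0.25) / 8 = 0.71875
```

Recomputing this per objective, for each infrastructure set in Section C, produces the time-phased expectations of Section D.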

Figure 17 shows the results of this effort.

Figure shows strategic process scorecard

Figure 17: What process infrastructure to build when

The specific progression of infrastructure elements shown in Figure 17 fulfills the business objectives to varying degrees as time proceeds. Here, you can see that the client has elected to build the infrastructure with the following progression:

  • Set #1 [First provide] cross referencing needs and dependencies in commonly reviewed documents.
  • Set #2 [Then] add use of UML.
  • Set #3 [Then] add UML cross referenced between models and code + Architectural analysis.
  • Set #4 [Then] add automated testing of Code + Manual testing of behavior.
  • Set #5 [Then] add Tracking exchange and acceptance of project artifacts.
  • Set #6 [Then] add Corporate wide planning tied to actual resource assignment.
  • Set #7 [Then] add Integrated time tracking.
  • Set #8 [Then] add End user error & enhancement collection, reporting, and cross referencing.

In following this progression, the client anticipates first improving the business objectives of productivity and quality, but deferring realistic scheduling, as the infrastructure elements that support resource planning and portfolio management are delayed.

All the elements of the process infrastructure should support getting the job done, and then measuring the results. And so discrete performance measurement is a vital element for gauging what is working and what is not working. Each is its own probe into the process engine.

Examples of such metrics include:

Measuring market cohesion

  • Customer future feature ranking
  • Customer valued feature
  • Customer realized feature ranking
  • GOR

Measuring requirements stability

  • Number of features/iteration
  • Change to content of features/iteration
  • Stability of features
  • Number of Use Cases
  • Change to content of Use Cases
  • Stability of Use Cases
  • Number of Scenarios
  • Change to content of Scenarios
  • Stability of Scenarios
  • Requirements change orders opened
  • Requirements change orders assigned/not assigned
  • Requirements change traffic
  • Stability

Measuring design stability

  • Defect count
  • Date of report
  • Date of closure
  • Source: Internal/Client's customer
  • Release: Prerelease/Post release
  • Features involved (Direct/Incidental)
  • Use Cases involved (Direct/Incidental)
  • Scenarios involved (Direct/Incidental)
  • Elements of architecture involved (Direct/Incidental)
  • Software Lines of Code (SLOC) discarded
  • SLOC created
  • SLOC modified
  • Total SLOC
  • Mean time between failures
  • Product Maturity
  • Latency to close defect
  • Defect backlog

Measuring cost management

  • Estimated next phase cost to complete
  • Next phase cost outlook stability
  • Actual next phase completion cost
  • Next phase cost prediction accuracy
  • Estimated total cost to complete
  • Total cost outlook stability
  • Actual total completion cost
  • Total cost prediction accuracy

Measuring schedule management

  • % Iteration objective achieved
  • Estimated total time to complete
  • Total time to complete outlook stability
  • Actual total time to complete
  • Time prediction accuracy
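Two of the metrics above, cost prediction accuracy and time prediction accuracy, can be sketched as a simple ratio of estimate to actual. The formula and sample figures are illustrative assumptions, not the article's definitions:

```python
# Hypothetical sketch of a prediction-accuracy probe: compare the estimate
# made at phase start with the measured actual at phase end.

def prediction_accuracy(estimated, actual):
    """1.0 is a perfect prediction; lower means a larger miss in either direction."""
    if estimated <= 0 or actual <= 0:
        raise ValueError("estimates and actuals must be positive")
    return min(estimated, actual) / max(estimated, actual)

# Next-phase cost: estimated $120k, actual $150k
print(prediction_accuracy(120_000, 150_000))  # 0.8

# Total time to complete: estimated 40 weeks, actual 44 weeks
print(round(prediction_accuracy(40, 44), 3))  # 0.909
```

Tracked iteration over iteration, such ratios show whether the process engine's estimates are converging toward reality, which is the point of instrumenting each probe.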

The result of a well-instituted process is an ongoing commitment to achieving the desired aims of the corporation. But rather than evolve this process based on the popular winds of the day, the strategic engagement institutes this three-layer approach of business objectives, practices, and infrastructure, as illustrated in Figure 18.

Figure shows forces involved in strategic evolution of business management

Figure 18: Strategic evolution

Figure 18 is an expansion of what I presented in Figure 11. Now, however, business objectives are the drivers, and represent the most intransigent aspect of this effort. These objectives may evolve as downstream lessons are learned, but more often it is their interpretation through the practices that requires change. By far, the most fluid aspect of this overall effort is the infrastructure. The tools, the training, and the techniques should be under constant refinement.

With such a tiered understanding of intentions, a consistent evolution of process can be maintained under the sometimes chaotic onslaught of changing needs and interpretations.

Summary: It's a matter of culture

Process improvement can be a major disruption to the status quo. This will especially be the case in organizations with low visibility regarding measurable, quantifiable objectives. But inevitably, corporate cultures, like the greater human cultures they serve, must change as they strive to prosper amid uncertain and changing surroundings. Advancement can only be achieved by sustained and measured observations, along with the strong desire to achieve goals that suit our higher aspirations.


1 Napoleon Hill (1883-1970) is known as "the father of The Science of Success."

2 The role of process mentor is best achieved by an individual from outside the organization who will function as an independent and objective change agent.

3 For now I am proposing the more "artistic" approach of intuitive plotting.

4 The primary point of contact is presumed to be the principal inside change agent for the client.

5 Capability Maturity Model

6 Leader to Leader, No. 10 Fall 1998: Winning at Change by John P. Kotter

7 Detailed below

8 Numbers have been changed from the actual company artifact.

9 Galileo Galilei

10 Note that the column that contains these weighting values is hidden in the previous figure. It is best that the selection of "Low," "Med," and "High" is done blind to these values.


