Improve the value of your CLM reports by using metrics

A guide to using metrics with Rational Reporting for Development Intelligence and Rational Insight

The Rational solution for Collaborative Lifecycle Management (CLM) is a set of seamlessly integrated applications that work together as one platform for managing the complete lifecycle of your development projects. The application and lifecycle data that your teams create collaboratively for their projects is made available by CLM for reporting in a data warehouse. Although CLM includes more than 200 sample reports, adding either the Rational Reporting for Development Intelligence (RRDI) component or IBM Rational Insight gives you design tools for customizing the CLM samples and creating your own reports. With these tools, you have access to the data in the warehouse, which is grouped into two main categories: operational data (ODS) and metrics. This article gives you an in-depth look at the metrics available and how to use them.

Peter Haumer, Senior Software Engineer, IBM

Peter Haumer is a software architect for Portfolio Strategy and Management solutions in Rational software. Before joining the PSM team, he was the component lead for reporting in the IBM Rational Quality Manager application. Find out more about him on the Rational Experts web page.



18 September 2012


The Rational solution for Collaborative Lifecycle Management (CLM) is a set of seamlessly integrated applications that work together as one platform for managing the complete lifecycle of your development projects. The application and lifecycle data that your teams create collaboratively for their projects is provided to you for reporting by CLM in a data warehouse.

Although CLM applications include more than 200 sample reports, with the additional IBM® Rational® Reporting for Development Intelligence component ("Rational Reporting" hereafter) or with IBM® Rational® Insight performance measurement and management software, you have access to powerful report design tools for customizing CLM samples and creating your own reports. With these tools, you have access to the data in the warehouse, which is mainly grouped into two categories: operational data (ODS) and metrics. There is already detailed instructional material available for creating ODS reports (see citations 2, 3, and 4 in Resources), so this article gives you a more in-depth look at the metrics available and how to use them. It focuses less on the authoring experience, which is covered by a new set of video tutorials (citation 5 in Resources). Instead, this article answers these frequently asked questions about metrics and CLM:

  • Which metrics provided by the product-independent warehouse are recommended for use with CLM applications?
  • What do the metrics' measures and dimensions mean in CLM terms, and where does that data come from?
  • Which metrics are provided through the CLM Data Collection jobs, and which require the Rational Insight Data Manager?

Background

If you have had no exposure to the warehouse metrics until now, you might wonder why these questions come up when authoring reports for CLM. The main reason is that the warehouse used by CLM follows the same schema that Rational Insight uses. This schema has been designed to provide a product-independent representation of development data so that it can be used with all Rational software, as well as with external products that Rational software integrates with. In addition to the schema, we also provide a common core Reporting Data Model, which gives end users access to the warehouse data through the user-friendly, localized names that you see in the report design tools. This Reporting Data Model defines its own product-independent terminology, which does not necessarily match your applications' terminology, because different applications use different terms for similar things. Moreover, not all warehouse tables are populated by all applications in the same way, because each application stores different data even when they operate in the same domain.

For example, you find different information in the warehouse, such as requirement attributes, when you use IBM® Rational® Requirements Composer, IBM® Rational® DOORS®, or IBM® Rational® RequisitePro®. Another example is that work items from CLM's change and configuration management (CCM) application, IBM® Rational Team Concert™, are represented in the same warehouse table, called Requests, as change requests that are managed in IBM® Rational® ClearQuest®. Both of those applications are highly customizable; therefore, some data columns can be used differently by each application.

However, there is often also a large intersection of common data. This intersection can be used for creating common reports that show data from all applications together, based on the data in this one warehouse. For example, IBM® Rational® Quality Manager, the quality management (QM) application for CLM, integrates with all of the requirements management applications listed above, and all sample reports provided with Rational Quality Manager can show requirements data linked to test artifacts from any of these applications. You can even show a mix of these if you use more than one requirements application.

The warehouse can also grow with your integration needs. You might start with CLM and Rational Reporting, and then later add more Rational or external applications to the mix. As mentioned before, the CLM warehouse uses the same core warehouse schema and Reporting Data Model as Rational Insight. (Insight and CLM have some private schemata as well, but these also fit nicely into the scenarios described here.) Rational Reporting currently supports only the three CLM applications, but by switching to Rational Insight, you can expand your existing CLM warehouse to become an enterprise-wide warehouse that also stores data from other applications, such as the ones mentioned in the previous paragraph. In addition to supporting data from applications other than CLM, Rational Insight provides more sample metrics than CLM does. See citation 6 in Resources for the Rational Insight overview of these metrics. CLM provides data for a subset of these metrics, and this article tells you which additional metrics you get when using Rational Insight.

To summarize this background discussion: the advantage for report authors is that their reports can work with data from different applications. The challenge is to map your applications' terminology to that of the Reporting Data Model and to know which items of the model are populated by CLM or any other application. This is true for operational data as well as for metrics. For operational data, you can find detailed documentation in the Rational software information center under Reporting > Reference information for reporting > Reporting data dictionaries. For metrics, we provide this article.


The benefits and risks of using metrics

As mentioned previously, there are two categories of data in the warehouse:

  • Operational data (ODS stands for Operational Data Store), which is typically used for query-like list reports that show a representation of the raw application data in the warehouse, such as the list of test cases and their attributes.
  • Metrics, which provide an analytics view of the data. In other words, they represent processed data by taking the ODS data as input and aggregating this data into measures, based on some interpretation of it. Measures are typically counts of data items, such as the number of requests, or the number of test execution results. They can also be aggregated sums of numeric data, such as the sum of all weight points of test case execution records. To qualify these measures, they are collected in relation to a number of dimensions. You can use these dimensions to get to the specific numbers that you are looking for quickly.

Figure 1 shows how such a metric is represented structurally in the report development tools. You see a breakdown of the Request metrics into measures, indicated by the ruler icons, such as Actual Work and Total Requests, as well as dimensions, such as Category or Date, indicated by the axis icons. When you create reports, you basically drag and drop from this tree into a report component, such as a chart. The dimensions act as qualifiers and filters: for example, you might want to narrow the Total Requests measure to requests that are owned by Project X, are of the Defect type, are in the Open state, and have no owner assigned (the owner is referred to by the metrics as Resource in Figure 1). Hence, project, type, state, and owner are dimensions of a Request metric, and the number of requests is the measure. Applying this metric with this set of dimension filters yields a report that shows the "Number of open defects that have no owner."

Figure 1. The structure of a metric showing the measures and dimensions
[Screen capture: directory-style tree of measures and dimensions]
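To make the measure-and-dimension idea concrete outside of the Cognos tooling, here is a minimal Python sketch (using pandas) of what a metric's fact data conceptually looks like and how dimension filters narrow a measure down to the example above. All table and column names here are illustrative, not the actual warehouse schema.

```python
import pandas as pd

# Illustrative fact rows for a Request metric: one row per request,
# qualified by dimension values (project, type, state, resource/owner).
facts = pd.DataFrame([
    {"project": "Project X", "type": "Defect", "state": "Open",   "resource": None},
    {"project": "Project X", "type": "Defect", "state": "Open",   "resource": "bob"},
    {"project": "Project X", "type": "Defect", "state": "Closed", "resource": "alice"},
    {"project": "Project Y", "type": "Story",  "state": "Open",   "resource": None},
])

# Apply the dimension filters from the example in the text: owned by
# Project X, of the Defect type, in the Open state, with no owner.
selection = facts[
    (facts["project"] == "Project X")
    & (facts["type"] == "Defect")
    & (facts["state"] == "Open")
    & (facts["resource"].isna())
]

# The "Total Requests" measure is simply the count of matching fact rows.
print("Number of open defects that have no owner:", len(selection))  # -> 1
```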

Metrics are collected at regular intervals; in the CLM case, they are collected daily. Therefore, they provide not only the most recent view of the data but also trends over time. Using the previous "Number of open defects that have no owner" example, the collected trend information for this measure would allow you to plot a graph in a report that shows the number of defects with no owner over time, using the Date dimension. This way, the report can show how consistently incoming defects are triaged to owners.
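Continuing the illustrative sketch from above, trend reporting simply adds the collection date as a further dimension: because the data collection jobs load a new snapshot every day, grouping the measure by that date yields the time series (all values below are made up).

```python
import pandas as pd

# One snapshot row per daily data collection (illustrative values).
snapshots = pd.DataFrame([
    {"collection_date": "2012-09-01", "open_defects_without_owner": 12},
    {"collection_date": "2012-09-02", "open_defects_without_owner": 9},
    {"collection_date": "2012-09-03", "open_defects_without_owner": 14},
])

# Using the Date dimension as the x-axis turns the measure into a trend.
trend = snapshots.set_index("collection_date")["open_defects_without_owner"]
print(trend)
# trend.plot()  # with matplotlib installed, this draws the trend line
```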

The interpretation and usefulness of this metric and the reports that show it are now very much dependent on your development process. If your process values the consistent allocation of defects to owners to encourage immediate fixing, then this report is more useful than if your team just picks defects from a ranked backlog whenever they work on defects in allocated time slots. Those who read this report also need to understand that it shows only the trend of open defects, not the rate of incoming new defects nor the time it takes to assign them. So you might require multiple metrics to support the goals of your development process.

As already mentioned, the use and usefulness of metrics is highly dependent on your development process. Metrics included with a CLM application could be either very process-specific or general enough to work in many cases; the authors of the metrics available in the CLM solution and Rational Insight chose the latter approach. This process dependency also holds true for the definition of the metrics. In the example above, the process might change what actually constitutes a defect. For example, a process in Rational Team Concert, which provides the input here for counting requests, might define completely different types of work items, where the mapping to Defect might be difficult. In the simplest case, defects might be called bugs; in another case, the work item type might be generic, such as Change Request, so that it is not immediately possible to distinguish defects from enhancement requests. Moreover, the notion of the Open state could be different in each process, because the workflow of work items in Rational Team Concert or requests in Rational ClearQuest can be completely user-defined. Given that each project contributing to the warehouse might use a different process, designing a warehouse that provides useful metrics for any kind of process is a major challenge. As a user of metrics, you need to evaluate each metric in terms of these criteria:

  • What data does it use from each application that provides data in that domain?
  • Which development process and application customization were used to create the data, and what does that mean for the metric?
  • Which process is used to interpret and use the metrics?

Getting started with metrics

You write reports that use metrics by selecting a measure and dimensions and then applying them to various report components in the report design tool, such as graphs or cross-tabs. You also define prompts that ask report users for filter values on the dimensions. In the previous example, this could be the specific state to use (which would eliminate the problem mentioned above of defining what Open means), the projects to focus on, and so on. Because it is generally well known that a picture tells more than a thousand words and a video tells more than ten thousand, we have prepared several demonstration videos that walk you through the report-writing steps. Therefore, if you have not seen these videos on YouTube yet, please stop reading now, watch them, and return here afterward:

  • "Introduction to IBM Rational RRDI v2.0 Report Authoring"
  • "Using the IBM® Cognos® Query Studio v10.1 User Interface and Data Package"
  • "Build a list report in Cognos Query Studio v10.1"
  • "Build a Crosstab and Chart in IBM Cognos Query Studio v10.1"

[Intermission]

Welcome back! The video titled "Build a Crosstab and Chart in IBM Cognos Query Studio v10.1" showed you how to write a cross-tab and bar chart report by using a metric that counts test execution results. It used the Query Studio tool, which lets you drag in query items and immediately runs the current report.

To start your own exploration of the metrics available, we recommend that you use this same tool and begin experimenting. Using the sections that follow, you can start creating similar reports with the metrics that interest you. Exploring the metrics with your own data helps you judge how well each metric fits your development process and what data is available in your application setup.


A review of metrics for CLM

The following sections provide a review of key metrics that we think provide relevant data for use with the CLM solution for a range of interesting scenarios. In the Rational Reporting and Rational Insight report design tools, metrics are presented by the Reporting Data Model in a tree browser view that groups metrics by domain:

  • Change Management
  • Configuration Management
  • Project Management
  • Quality Management
  • Requirement Management
Figure 2. Metrics are grouped by these domains
[Screen capture: tree view of the metric groups]

This mapping might not always meet your expectations, because many metrics use data from more than one domain. The rule of thumb for locating a metric is that the data used for the metric's measures comes from that domain. For example, "Request Metrics with Test Plan" would be used primarily for Quality Management reports. However, because it provides measurements of requests (counting defects found during testing of specific test plans), it has been placed in the Change Management group.

The following sections review each of these groups except Project Management, which is not used by CLM. In the Configuration Management group, only the Build metrics provide data related to CLM, and those are not discussed here either.

Common dimensions

Before you review the individual metrics, however, go through the following list of common dimensions, which are used throughout the groups. Table 1 focuses on dimensions where the mapping from CLM terminology is not obvious or where additional considerations are required. It is an extensive but not complete list.

Table 1. Common dimensions used by the metrics
Dimension name | CLM data source | Use
Category | Work Item Category | In the Rational Team Concert UI for work items, this is shown as the Filed Against field.
Classification | (not applicable) | This dimension indicates the origin and type of data, such as RTC Work Item (Rational Team Concert), RRC Requirement (Rational Requirements Composer), or RQM Execution Result (Rational Quality Manager). Therefore, it lets you distinguish requests coming from Rational Team Concert from those from Rational ClearQuest, requirements from Rational Requirements Composer from those in Rational RequisitePro, and so on.
Creator | Work Item Creator | The Creator role, meaning the user who created a work item, was added to all Request metrics in the CLM 2012 release.
Date | (See comments) | If the word Date is in the name of the metric (the Execution Result Date metrics, for example), the actual creation or change date of the ODS data element mentioned in the metric name is used (the actual start date of the execution results). In rare cases, the word Date is not in the metric's name, but then the date dimension is named differently, such as Closure Date in the Request Closure metrics. In all other cases, the Date dimension represents the data collection date, meaning the date when the data was loaded into the metric's fact table. Therefore, most metrics provide trend data over these collection dates: a snapshot of the data at the time of each daily data collection.
Iteration | Work Item Planned For; Test Plan Phase (Iteration) | Depending on which metric you are using, this dimension refers either to the Planned For iteration in which a work item is to be resolved or to the time period in which tests assigned to a test plan are to be executed within the plan's test schedule. Generally, if a metric has "with Test Plan" in its name, Iteration refers to the test plan schedule in CLM 2012. If such a metric provides measures on requests, the Iteration dimension indicates the test phase in which the requests were created, that is, the phase in which defects were found during testing. In CLM 2011, these "with Test Plan" metrics were broken, and the Iteration dimension cannot be used there.
Project | Project Area | This dimension references the project area of the artifact for which the measures are defined. For example, in the Request Metrics with Test Plan variant, Project is the project area of the work items that populated the Request table, not the project area of the test plan.
Release | Work Item Found In | Release is populated with the work item's Found In information. The source values are defined as Releases for a Rational Team Concert project area.
Request Priority | Work Item Priority | Represents the priorities defined for work items. Each Rational Team Concert project area can potentially define its own set of priorities for its work item types, and all priorities are regarded as individually different. In other words, when you use this dimension, you might find many entries for High, Medium, Unassigned, and so forth, because each project's priorities are grouped individually.
Request Severity | Work Item Severity | Represents the severities defined for work items on a per-project basis. The comments above about Request Priority apply here as well.
Request Status | Work Item State Groups | In Rational Team Concert, states are grouped into state groups. This makes work item states somewhat comparable across the individually customized states of each project. Therefore, this dimension should be used for state-based reports that show data from more than one Rational Team Concert project area.
Requirement | Requirement | Allows you to show measured data, such as execution results, grouped by linked requirements. In CLM, requirements are linked to test cases and to test plans through collections, as well as through so-called Implementation Requests, which are Stories in the Scrum process template.
Resource | User | Resource is the data warehouse's name for a user. Its usage differs depending on the metric. In most cases, however, it represents the owner of the measured artifact, such as the owner of a work item. (See also the Creator dimension, another dimension for users; there, the meaning is clear from the name.)
State | Work Item State; Test Case State | State is the project area-specific state of artifacts. In CLM 2012, this dimension covers work item states in the Request metrics and test case states in the Test Case metrics.
Team | Team Area | This dimension represents the team areas of the project area of the artifact that the measures are related to.
Test Case | Test Case | Allows you to show measured data, such as execution results, grouped by linked test case.
Test Plan | Test Plan | Allows you to show measured data, such as execution results or defects (requests), grouped by linked test plan.
Type | Work Item Type; Requirements Type | Represents the project area-specific work item or requirement type. If you want to write a report that measures Defect work items, you would define a filter here. But be aware that every project area might have its own set of work item types, and it is possible that not all of them label defects as Defect.
Verdict | Execution Result Actual Result | This dimension represents the test result categories, such as Passed, Failed, and Deferred, that are available for Rational Quality Manager execution results. Although these are customizable on a per-project basis, the current implementation of the data collection assumes one set of values, namely those provided by the default process template for Rational Quality Manager projects.
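Because the Date row in Table 1 describes two quite different behaviors, here is a small illustrative sketch (with made-up data) that contrasts event-date metrics, which bucket items by when they actually happened, with snapshot metrics, which count everything that exists on each daily collection date.

```python
import pandas as pd

# Work items with their creation dates (made-up data).
items = pd.DataFrame(
    {"created": pd.to_datetime(["2012-09-01", "2012-09-01", "2012-09-03"])})

# Event-date semantics (e.g., the Execution Result Date metrics):
# the Date dimension holds the date when something actually happened.
print(items.groupby("created").size())   # 2012-09-01 -> 2, 2012-09-03 -> 1

# Snapshot semantics (most other metrics): on each collection date,
# count everything loaded into the fact table so far (a daily snapshot).
for d in pd.to_datetime(["2012-09-01", "2012-09-02", "2012-09-03"]):
    print(d.date(), int((items["created"] <= d).sum()))  # 2, 2, 3
```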

Table 2 shows dimensions that you might expect to provide data from CLM but that do not. If you add them to your reports, you will typically get an empty result, or the result will show "Info not Available." Therefore, these dimensions should not be used for CLM-specific reports.

Table 2. Dimensions that are not populated by CLM data
Dimension name | Comments
Component | Not the SCM components from Rational Team Concert. Not used by CLM.
Fixed in Release | Does not provide values in CLM, because work items are resolved in iterations, and the link to the release is missing from the data that CLM provides to the data warehouse.
Platform | No platform information is provided by CLM.
Product | This dimension does not provide data from CLM.
Project (by Portfolio) | Assumes the grouping of projects by portfolio. However, CLM does not use portfolios.

Change management metrics

Change management metrics provide you with measures relevant to your Rational Team Concert work items. They can be used for scorecards or bar charts that give you counts on open defects, as well as for trends of the creation or closure of defects or work items of any type. Figure 3 shows a sample report that is included with Rational Quality Manager (with data from our own use of Rational Team Concert and Rational Quality Manager for development and testing of these products).

The report uses the Request Creation and Request Closure metrics that are documented further down, in Table 3. It compares the number of arriving and resolved defects linked to testing that follows a test plan. It can be used to evaluate quality and how efficiently the team keeps ahead of newly discovered defects by resolving existing ones during testing.

Figure 3. QM sample report: Defect Arrival and Resolution Over Time
[Screen capture: trend graph of defect arrivals and resolutions]

The following table reviews each of the Change Management metrics that can be used with CLM. For each metric, the table lists which measures work with CLM data and points out any specifics that report authors need to know. There were changes to these metrics between CLM 2011 and CLM 2012 (which align with IBM Rational Insight 1.0.1.1/1.1 and Rational Insight 1.1.1, respectively), and those are mentioned here. The table also tells you whether a metric is available with the CLM Data Collection jobs or whether it requires the installation of IBM Rational Insight and its Data Manager ETL.

Table 3. Configuration and change management metrics that provide results with CLM data
Metric name | Measures that use CLM data | Measures that do not work with CLM | Use | Requires Rational Insight ETL
Release Request Metrics | Total Request of Pre-Release, Total Request of Post-Release | Total Work | The two measures for pre- and post-release look at the time difference between the availability date of the release and the creation date of the work item to determine whether the request was found before or after the release. However, this metric has only limited usefulness, because it looks only at dates and always considers all work items without project area boundaries, given that work items are not associated with the release. Therefore, the sums of the post- and pre-release measures always add up to the same value: the number of all work items in a project. | Yes
Release Request Turnaround Metrics | (All) | (none) | This metric follows the approach of the metric above, but it looks at specific statuses (see the Request Status dimension in Table 1) and the time that work items stayed in those statuses. The measures provide information about the maximum, average, and minimum days for a work item to move from the Submit state to the Resolve state, pre- and post-release. The measurements include the total number of requests and the maximum, minimum, and total days that requests spent before and after the release. | Yes
Request Aging Metrics [with Test Plan] | (All) | (none) | This metric provides information about the maximum, average, and minimum days that requests spend in a particular state. It uses the process-specific states, not the status, in this case. The "with Test Plan" variant adds the test plan dimension to the metric, which is essential for testers who want to group work item (defect) measures by test plan. To address the needs of testers, the Iteration dimension here refers to the test plan schedule and the iterations defined in Rational Quality Manager, whereas the base metric's Iteration dimension refers to the Rational Team Concert iterations assigned to the work items as Planned For values. | Yes
Request Closure Metrics [with Requirement] [with Test Plan] | Closure | (none) | This metric provides information about the closure rate of requests. Closed refers in this case to the Closed state group, not to specific states with the name Closed. In other words, the measure is the total number of requests that are in the Status of Closed (see Table 1 for more details about the Request Status dimension). The Date dimension is used in this metric to express the actual date when requests were moved to a state in the Closed state group. The variants with Requirement and with Test Plan allow you to access the measure for a specific requirement or test plan (or sets of them). Note: The Request Resolution metrics are not listed in this table, because they do not work for CLM; there is no work item state group called Resolved. States with the name Resolved are typically added to the Closed state group. | No, for the base metric; Yes, for the two variants
Request Creation Metrics [with Requirement] [with Test Plan] | Arrival | Actual or Planned Duration, Story Points | Similar to the closure metrics, this metric uses the Date dimension to count created requests by using the Arrival measure. | No, for the base and Test Plan variant; Yes, for the Requirement variant
Request Failure Metrics [with Requirement] [with Test Plan] | Total Requests | (none) | This metric works with your CLM data only if you have a work item type that is spelled Defect. Localized names will not work. For these requests, it counts the ones whose Status transitioned from Closed to Open (see Table 1 for more information on Status). The interpretation of this measure could be that these defects failed validation or re-testing and had to be reopened. | Yes
Request Metrics [with Requirement] [with Test Plan] | Total Requests, Actual Work, Planned Work | Total Blocking Requests | This is a basic metric that counts requests (work items, for example) by using key dimensions, including variants that include linked test plans (to report on defects found during the testing of a plan) or linked requirements. It also provides measures for actual and planned work, which can be used for iteration burndown charts based on the work items' Estimate, Correction, and Time Spent fields. Divide the measure by 3600 to convert it to hours. The Total Blocking Requests measure does not work with CLM, because CLM does not provide this information in the ODS Request table column that the metric relies on. | No
Request Turnaround Metrics [with Test Plan] | (All) | (none) | This metric provides information about the maximum, average, and minimum days for a request (which could be a defect related to a test plan when you use the variant metric) to move from the Open status to Closed. | Yes
Story Metrics | Total Story Points, Total Comments, Total Subscribers | Total Tags | This metric provides various measures for work items of the type that is spelled Story. Localized names will not work. You need Rational Team Concert 3.0.1.2 or later for the metric to work. For these Story work items, you can report on the total number of stories, the number of comments submitted against each story, and the number of subscribers. The Total Tags measure actually provides only the number of work items that have tags; it does not count the number of tags assigned to work items. | Yes
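Two details from Table 3 are worth a quick illustration: the date comparison behind the pre-/post-release split of the Release Request Metrics, and the seconds-to-hours conversion for the work measures of the Request Metrics. This is a sketch of the described logic with made-up values, not the actual data collection code.

```python
from datetime import date

def classify_request(created: date, release_available: date) -> str:
    """Pre-/post-release split as described for the Release Request Metrics:
    only the two dates are compared, because work items carry no real
    association with the release."""
    return "pre-release" if created <= release_available else "post-release"

print(classify_request(date(2012, 6, 1), date(2012, 9, 1)))   # pre-release
print(classify_request(date(2012, 9, 15), date(2012, 9, 1)))  # post-release

# The Actual Work and Planned Work measures are stored in seconds, so
# divide by 3600 to report hours, for example in a burndown chart.
actual_work_seconds = 27000
print(actual_work_seconds / 3600, "hours")  # 7.5 hours
```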

Quality management metrics

Quality metrics refer mainly to test execution results and trends, but they also cover areas such as the use of lab assets or the status of test case authoring within a project. Figure 4 shows the Execution Trend report, a very popular sample report included with Rational Quality Manager, which uses these metrics:

  • The Execution Result Point metrics, with Iteration
  • The Execution Work Item metrics, with Iteration
  • ODS tables that contain plan data, to show a team's test execution performance

The report in Figure 4 plots points for executed tests (points are a metric for expressing the effort of running tests), comparing attempted points to completed points for each week. It presents three pairs of these trends:

  • Planned: an allocation of estimated points over time in the test plan schedule (see the "Building a Report in Rational Common Reporting" video cited in Resources)
  • Computed: a summation of the points estimated for each test case execution record assigned to the plan and milestone, provided by the second metric
  • Actual: what the team actually executed by a certain date, provided by the first metric
Figure 4. The Execution Trend report shows a team's estimated vs. actual test performance
[Screen capture: parameter summary and trend graph]
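To make the three trend pairs of the Execution Trend report more tangible, here is a sketch with made-up weekly numbers: the planned line comes from the test plan schedule, the computed line from summing the TCER weights, and the actual line from the execution result points.

```python
# Planned: cumulative allocation of estimated points in the test plan schedule.
planned = {"W1": 100, "W2": 220, "W3": 330}

# Computed: sum of the point estimates (weights) of the test case execution
# records (TCERs) assigned to the plan and milestone.
tcer_weights = [30, 40, 50, 60, 70, 80]
computed_total = sum(tcer_weights)  # 330 points to be executed in total

# Actual: execution result points accumulated by each week, as attempted
# versus completed points (made-up values).
attempted = {"W1": 60, "W2": 150, "W3": 280}
completed = {"W1": 50, "W2": 120, "W3": 240}

for week in planned:
    print(week, "planned:", planned[week], "of", computed_total,
          "attempted:", attempted[week], "completed:", completed[week])
```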

The following table lists all metrics that can be used with CLM data, following the same scheme as the change management metrics discussion.

Table 4. QM metrics that provide results with CLM data
Metric name | Measures that use CLM data | Measures that do not work with CLM | Use | Requires Rational Insight ETL
Execution Result Date Metrics | (All) | (none) | This metric provides a summary of execution results, using the result's actual start date for the Date dimension. Other execution result metrics populate the Date dimension with the data collection date and count all of the results available at that date. This metric counts results in relation to the actual start date of the execution instead; therefore, it counts only results started at that date. The measures provided reflect the individual execution results' verdicts, such as Attempted, Failed, or Deferred, as allocated by the tester using the sliders in the execution result web UI. | No
Execution Result Metrics [with Iteration] | Total Execution Results | (none) | A simple metric that allows you to count the absolute number of execution results available at the time of collection. You can use the Verdict dimension to focus on specific results, such as Attempted or Failed. The "with Iteration" variant of the metric requires use of the Iteration dimension, which refers to a test phase in the schedule of a test plan. | No
Execution Result Point Metrics [with Iteration] | (All) | (none) | This is one of the metrics used for the Execution Trend report shown in Figure 4, where it provides the Actual values of the graph. It provides measures for performed execution points, by verdict, available at the time of the data collection. The Iteration variant requires the use of the Iteration dimension in the same way as discussed in the row above. | No
Execution Work Item Metrics [with Iteration] [with Requirement] | Total Execution Work Items, Total Weight | (none) | This is the other metric used by the sample report in Figure 4, where it provides the Computed values. It provides measures that count the total number of test case execution work items available, as well as their total weight. Therefore, when you use the Iteration variant, you can access the number of execution work items, and their specific sum of points, allocated to a particular iteration. You can use the other variant, "with Requirement," to filter the metric with sets of specific requirements to write reports that show when requirements are planned for testing. | No
Job | Total Number of Job Results | (none) | This metric provides information about a job execution result based on a lab resource. | No
Lab Utilization Metrics | Total Number of Machines, Reserved Machines | (none) | This metric provides information about the use of lab resources, based on their reservation date. | No
Test Case Metrics [with Iteration] [with Requirement] | Total Test Cases | (none) | This is a simple metric that counts the total number of test cases. But with its various dimensions, it can be used for reports that show trends of test case states for specific test plan iterations, as well as in relation to linked requirements. | No
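As a simple use of the Execution Result Metrics together with the Verdict dimension from Table 1, a pass-rate calculation might look like the following sketch (the verdict counts are made up):

```python
# Total Execution Results broken down by the Verdict dimension.
results_by_verdict = {"Passed": 180, "Failed": 35, "Blocked": 10, "Deferred": 5}

total = sum(results_by_verdict.values())                 # 230 results overall
pass_rate = results_by_verdict["Passed"] / total
print(f"Pass rate: {pass_rate:.1%} of {total} results")  # Pass rate: 78.3% of 230 results
```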

Requirements management metrics

This final section walks you through the requirements metrics that work with CLM data. In general, Rational Requirements Composer does not provide any data about changes and requirement versions. Therefore, no measure that refers to changes currently works.

Table 5. Requirements management metrics that provide results with CLM data
Metric name | Measures that use CLM data | Measures that do not work with CLM | Use | Requires Rational Insight ETL
Child Requirements Metrics | Total Children Requirements | (none) | This metric counts the number of requirements that have a parent requirement. | Yes
Requirement Metrics [with Test Case] [with Test Plan] | Total Requirements | Total Changes | These three metrics provide counts for the total number of requirements and for requirements that are linked to test cases or test plans through requirements collections. The latter would also count requirements directly linked to test plans, but that is not supported by CLM's requirements management application. Other requirements management tools, such as Rational RequisitePro, do use this relationship. Note: Several dimensions of these metrics might not work with your requirements management project, because all attributes in requirements management are custom-defined. Only if you have attributes such as Priority or Stability that are spelled exactly like the dimension names will they provide data for these metric dimensions. | No
Requirement Traceability | (not applicable) | (not applicable) | This entry in the tree is actually not a metric, but you can use it to access the various ways in which requirements trace to other CLM artifacts. As with the metrics and dimensions listed above, not all entries here are supported by CLM data. Traces to Activities, Customer, and Tasks are not available, because CLM does not cover those concepts. | No
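For completeness, here is a sketch of the Total Children Requirements measure from Table 5, counting requirements that have a parent (made-up data):

```python
# Requirements with optional parent links (made-up data).
requirements = [
    {"id": "R1", "parent": None},
    {"id": "R2", "parent": "R1"},
    {"id": "R3", "parent": "R1"},
]

# Total Children Requirements: the count of requirements with a parent.
total_children = sum(1 for r in requirements if r["parent"] is not None)
print("Total Children Requirements:", total_children)  # 2
```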

Conclusions and next steps

This article walked you through the relevant metrics available to you when you use Rational Reporting for Development Intelligence or Rational Insight with CLM. You also learned which parts of the metrics work and which parts do not, because the required data is missing from the CLM reportable REST APIs. For easy reference later, the tables also list which metrics are available with CLM Data Collection and Rational Reporting and which require the extended Rational Insight ETL for data collection.

As a next step, start exploring these metrics. Use Query Studio or Report Studio as shown in the videos (see the YouTube video playlist in Resources), and start creating reports that use the metrics listed in this article. Experiment with various dimension filters.

It will also be helpful to review the more than 100 Cognos sample reports that are included with CLM. They use most of these metrics in various ways. Open these reports in Report Studio and try to understand the queries that are used to build them.


Acknowledgments

I want to thank my colleague Ashish Apoorva for creating an early version of the Quality Management metrics review and Jun BJ Wang for feedback on those. Thanks to Amanda Brijpaul for feedback on the article and for recording the videos.

Resources

