Discover cross-project reporting with the IBM Jazz Reporting Service

JRS provides a new set of ready-to-use reports as OpenSocial gadgets for dashboards. These reports focus on cross-project, collaborative application lifecycle management by using data from interrelated artifacts in the change and configuration management (CCM), quality management (QM), and requirements management (RM) applications. (To learn more about the various applications and products, see this overview on jazz.net.) The JRS application that renders these reports is easy to deploy, either on its own server or on an existing Rational solution for Collaborative Lifecycle Management (CLM) application server. It integrates with the CLM dashboards through the common widget catalog. After it is set up, CLM users can find 21 reports in the widget catalog. This article walks you through several of these new reports and describes usage scenarios for them.

Getting started with the dashboard gallery

After you have installed and configured the Jazz Reporting Service and integrated it with your CLM servers, your users can immediately access the new set of reports for their dashboards. To open the dashboard gallery, click Add Widget and select the new Jazz Reporting Service catalog. Start browsing through the gallery of reports. The thumbnail next to each report indicates the kind of report it renders, such as a bar chart or a table. Click the report name to see a detailed description, as shown in Figure 1.

Figure 1. JRS reports gallery in a CLM dashboard
Sample dashboard shows report widgets

After you add the report to the dashboard, you are prompted for filtering parameters that set the scope of the report to the projects and development iterations that you are interested in and to which you have access. The selected filters are then saved with the dashboard and can be changed at any time by clicking Filter or Edit.

The following sections show you a few of the sample reports that are available for various usage scenarios. The data shown in the examples is taken from a fictional, simplified demonstration and is not representative of full, real-life usage.

Viewing progress with cross-project status reports

If you have worked with CLM reports in the past, you know that all reports provided by CLM use a single project area as their scope. Even reports that provide traceability across the CLM lifecycle projects start by selecting one quality management, change and configuration management, or requirements management project area and following the traces to linked project areas from there. JRS reports are typically cross-project; you can compare data from multiple projects, teams, or timelines with one another.

If you are running a program of multiple projects or releases that include multiple applications or components that are being developed by multiple teams in parallel, you might be interested in high-level overview reports of the status of these releases. You might also want to run reports with more detailed information so that you can compare specific aspects of these parallel releases. For example, you might want to compare how each team is progressing with burning down story points in the current iteration, or how each team is doing at fixing defects. The Release Status report shown in Figure 2 provides a quick overview of the progress.

Figure 2. Release Status report compares story points of different application components
Critical story points for various components

In the example in Figure 2, four applications are being developed in one program and released together as one suite; the data for their current development iterations is rolled up to the release level. This report focuses on the completion of the total number of stories in the current development iterations. By hovering over the chart in the dashboard, you can read the exact numbers represented by each bar segment.

Figure 3 shows a second report that uses the same data query but presents more details of that query as a table. This report also counts open versus closed stories, but it breaks these numbers down by team area. It also adds counts of defects that have been fixed by each team and shows how many defects are still open. This view gives you a quick glance at the amount of defect work that affects regular story burndown.

Figure 3. Release Status list adds defect counts and breaks down progress by team
Aggregate defect counts and team progress
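
If you want to reproduce this kind of rollup yourself, for example from work item data that you exported for a spot check, the grouping behind Figures 2 and 3 is straightforward: count open and closed stories and defects, grouped by team or component. Listing 1 is a minimal sketch of that logic in Python; the records, teams, and field names are invented for illustration and do not represent the JRS data model or its query API.

Listing 1. Rolling up open and closed stories and defects per team (illustrative sketch)

from collections import defaultdict

# Hypothetical work item records; in a real setup this data lives in the CLM
# applications and is queried by JRS, not written by hand like this.
work_items = [
    {"type": "story",  "team": "Mobile Banking", "state": "closed"},
    {"type": "story",  "team": "Mobile Banking", "state": "open"},
    {"type": "story",  "team": "Web Banking",    "state": "closed"},
    {"type": "defect", "team": "Mobile Banking", "state": "open"},
    {"type": "defect", "team": "Web Banking",    "state": "closed"},
]

# Roll up open/closed counts of stories and defects per team, the same kind
# of grouping the Release Status list presents per team area.
rollup = defaultdict(lambda: {"stories_open": 0, "stories_closed": 0,
                              "defects_open": 0, "defects_closed": 0})
for item in work_items:
    counts = rollup[item["team"]]
    kind = "stories" if item["type"] == "story" else "defects"
    state = "closed" if item["state"] == "closed" else "open"
    counts[f"{kind}_{state}"] += 1

for team, counts in sorted(rollup.items()):
    print(team, counts)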

To drill further into the ongoing development progress of these application releases, use another report that shows an even more detailed breakdown. In addition to counting the burndown of work, the Iteration Health report in Figure 4 shows the dates for each iteration and how many days of development remain.

Figure 4. Iteration Health report breaks down iteration progress
Iteration dates and development days remaining

The Team Velocity report in Figure 5 is another drill-down report that focuses on one development team within one of the application releases. It provides trend data for the team's story point burndown in each iteration. Figure 5 shows that the team working on the Mobile Banking 1.6 application has already completed 36 story points and still has 8 points left to do for Sprint 1. The Sprint 2 bar shows that 37 points are already planned for the next iteration. The report even shows the size of the product backlog.

Figure 5. Team Velocity report shows team’s story point progress iteration to iteration
Story points burned down by a team per iteration
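
The arithmetic behind such a velocity view is a simple tally of completed versus remaining story points per iteration. Listing 2 sketches it with hypothetical, pre-aggregated story records that echo the Figure 5 numbers (the backlog size is invented); it is not how JRS itself computes the report.

Listing 2. Tallying completed and remaining story points per iteration (illustrative sketch)

# Hypothetical, pre-aggregated story point records for one team.
stories = [
    {"iteration": "Sprint 1 (1.6)",  "points": 36, "state": "closed"},
    {"iteration": "Sprint 1 (1.6)",  "points": 8,  "state": "open"},
    {"iteration": "Sprint 2 (1.6)",  "points": 37, "state": "open"},
    {"iteration": "Product Backlog", "points": 55, "state": "open"},
]

# Tally completed versus remaining points per iteration.
velocity = {}
for story in stories:
    bucket = velocity.setdefault(story["iteration"], {"done": 0, "remaining": 0})
    key = "done" if story["state"] == "closed" else "remaining"
    bucket[key] += story["points"]

for iteration, totals in velocity.items():
    print(f'{iteration}: {totals["done"]} points done, {totals["remaining"]} remaining')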

Keeping track of the iteration scope and feature creep

In addition to tracking the progress of planned work across development teams, JRS also provides you with reports for keeping track of your iteration scope. Speaking in scrum terms, if your development team values the stability of its committed sprint backlog, you want to keep track of items that are added or removed during an iteration. JRS provides two reports to do that.

Figure 6 shows the Scope Added report, which lists the items that were added to the current iteration of a selected project area timeline after that iteration started. Because you might not want to show items of all types in such a list (for example, you might want to exclude newly discovered impediments or defects), you can use the Work Item Type filter to specify the types that you are interested in tracking. The example report shows three story items that were added to the current iteration, Sprint 2 (1.6). For each item, the report shows the size, the creation date, and whether the item is newly created or was moved from another iteration or from the backlog, and if so, from where.

Figure 6. Scope Added report keeps track of items added during the iteration
Mid-iteration changes in Scope Added report
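
Conceptually, the report keeps only the items whose assignment to the iteration happened after the iteration start date and whose type matches the Work Item Type filter. Listing 3 illustrates that filter with invented records and dates; the field names are hypothetical and are not part of the CLM or JRS data model.

Listing 3. Flagging items added to an iteration after its start date (illustrative sketch)

from datetime import date

ITERATION_START = date(2014, 6, 2)   # assumed start date of "Sprint 2 (1.6)"
TRACKED_TYPES = {"story"}            # Work Item Type filter: track stories only

# Hypothetical work items with the date they were assigned to the iteration
# and, if applicable, where they were moved from.
items = [
    {"id": 101, "type": "story",  "added": date(2014, 6, 9),  "moved_from": "Product Backlog"},
    {"id": 102, "type": "defect", "added": date(2014, 6, 10), "moved_from": None},
    {"id": 103, "type": "story",  "added": date(2014, 5, 28), "moved_from": None},
]

# Keep only items of the tracked types that joined the iteration after it started.
scope_added = [item for item in items
               if item["type"] in TRACKED_TYPES and item["added"] > ITERATION_START]

for item in scope_added:
    origin = item["moved_from"] or "newly created"
    print(f'Work item {item["id"]} added mid-iteration ({origin})')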

The Scope Removed report presents the opposite information, using the same filters: which items of the specified types have been removed from the current iteration of a selected project area timeline. As Figure 7 shows, the report also lists the iterations (the next iteration or the product backlog, for example) to which the items have been moved.

Figure 7. Scope Removed report shows items removed from current iteration
Track removed items and new locations

Finding team dependencies and blockers

In a team of teams that develops complex applications with interdependent components or integrated applications, team members and development leads must be able to see immediately whether issues owned by one team block other teams. In the CLM CCM application, you can create Blocks relationships among work items such as defects. Each team can then use the Team Dependencies report, depicted in Figure 8, to see how it is currently blocked by other teams.

Figure 8. Team Dependencies report shows defects by another team
Blocking defects by team

The Team Dependencies report shows only work items that are related by blocking relationships and whose related items are owned by two different teams. The report is filtered by a team area selection; the selected team area represents the team that owns the blocked items, which the report lists by severity. For each of these blocked work items, the report then shows the blocking items and the teams that own them.
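
If you picture the underlying data as a set of blocking links between work items, the core filter of the report keeps only the links whose blocked end is owned by the selected team and whose blocking end is owned by a different team. Listing 4 is a minimal sketch of that filter with invented records; it is not the JRS implementation.

Listing 4. Filtering blocking links that cross team boundaries (illustrative sketch)

# Hypothetical blocking links: each pair is (blocking item, blocked item).
blocks_links = [
    ({"id": 11, "team": "Payments"},       {"id": 21, "team": "Mobile Banking", "severity": "Blocker"}),
    ({"id": 12, "team": "Mobile Banking"}, {"id": 22, "team": "Mobile Banking", "severity": "Major"}),
    ({"id": 13, "team": "Security"},       {"id": 23, "team": "Mobile Banking", "severity": "Critical"}),
]

SELECTED_TEAM = "Mobile Banking"   # the team area chosen as the report filter

# Keep only links where the blocked item belongs to the selected team and the
# blocking item is owned by a different team (same-team blocks are dropped).
cross_team = [(blocking, blocked) for blocking, blocked in blocks_links
              if blocked["team"] == SELECTED_TEAM and blocking["team"] != blocked["team"]]

for blocking, blocked in sorted(cross_team, key=lambda pair: pair[1]["severity"]):
    print(f'Item {blocked["id"]} ({blocked["severity"]}) is blocked by '
          f'item {blocking["id"]} owned by {blocking["team"]}')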

Blocked items include work items such as defects, but they can also include quality management tests whose completion is blocked by a defect discovered during testing. In contrast to the Team Dependencies report, which shows how a development team is blocked by defects assigned to other teams, the Blocking Work Items report in Figure 9 lists, for a selected development team, all the items that the team owns that block the progress of other teams (including test teams). This report shows items owned by the current team that meet one of these conditions:

  • The severity of the defect is Blocking.
  • The defect is associated with a blocking test execution record in QM.
Figure 9. Blocking Work Items report shows team’s blocked items
Team's blocked items by severity or blocking test

Understanding the effect of defects

The previous section describes JRS reports that list blocking defects that directly affect a team's ability to make progress. To view the team's overall defect situation, to see how these defects affect the delivery of the requirements committed to for the iteration, or to see how defects affect the completion of your regression tests in this iteration, choose one of the following reports. Each of these reports can be presented in your dashboard as a list or as a bar chart.

  • Defects by priority for a team and the current iteration
  • Defects per scoped requirement, shown in Figure 10
  • Defects per test case by test plan

Figure 10 shows only the requirements for which defects were actually found. A defect is counted for a requirement in RM if it is traced to a user story in CCM that was scoped for the current release and the user story was tested in QM. The report also shows how many of the defects found while testing a specific requirement are still open and how many are already closed. This information helps you assess the risks related to delivering on that specific requirement.

Figure 10. Defect per Requirement chart and list
Chart and list report
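
Conceptually, the aggregation follows the trace chain from requirement to user story to defect and counts open and closed defects per requirement, skipping defects that are not traced to a scoped requirement. Listing 5 sketches that counting with hypothetical link data; the actual JRS queries against the CLM data look different.

Listing 5. Counting open and closed defects per requirement (illustrative sketch)

# Hypothetical trace data: which requirement each user story elaborates,
# and which story each defect was found against.
requirement_of_story = {"story-1": "REQ-10", "story-2": "REQ-11"}
defects = [
    {"id": "D-1", "story": "story-1", "state": "open"},
    {"id": "D-2", "story": "story-1", "state": "closed"},
    {"id": "D-3", "story": "story-2", "state": "open"},
    {"id": "D-4", "story": "story-9", "state": "open"},   # story not in scope
]

per_requirement = {}
for defect in defects:
    requirement = requirement_of_story.get(defect["story"])
    if requirement is None:
        continue   # defect is not traced to a scoped requirement, so it is not counted
    counts = per_requirement.setdefault(requirement, {"open": 0, "closed": 0})
    counts[defect["state"]] += 1

for requirement, counts in sorted(per_requirement.items()):
    print(f'{requirement}: {counts["open"]} open, {counts["closed"]} closed')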

The other reports listed previously provide you with similar information. In the Defects per Test Case report, you can select one or more test plans and view the same defect statistics as in the requirements report, related to the test cases of those plans. The Defects by Priority report renders a chart and a list of the defects that are currently open for a selected team area, planned for its current iteration, and grouped by priority.

Tracing your requirements

The following group of reports focuses on requirements traceability in general.

The Story Traceability report in Figure 11 provides development teams with a complete CLM traceability coverage overview. For the CCM user stories planned for a specific iteration, the report includes traceability links to story elaboration requirements in RM, links to the test cases for the stories in QM, and a count of the currently open defects that were found while running those test cases. The report can be used to uncover gaps in traceability, such as missing test cases. It can also serve as a navigation aid: you can open and review related items directly by clicking the active links in the report.

Figure 11. Story Traceability report shows traces: stories to requirements, tests, defects
Iteration, requirement, test case, open defects
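
To make the idea of such a traceability overview concrete, Listing 6 assembles one row per user story (linked requirement, number of test cases, open defect count) and flags coverage gaps such as a missing test case. The records and link fields are invented for illustration and are not the CLM or JRS data model.

Listing 6. Building a traceability row per story and flagging gaps (illustrative sketch)

# Hypothetical user stories with their linked requirement, test cases, and
# the number of currently open defects found while testing them.
stories = [
    {"id": "story-1", "requirement": "REQ-10", "test_cases": ["TC-1", "TC-2"], "open_defects": 1},
    {"id": "story-2", "requirement": "REQ-11", "test_cases": [],               "open_defects": 0},
    {"id": "story-3", "requirement": None,     "test_cases": ["TC-3"],         "open_defects": 2},
]

for story in stories:
    gaps = []
    if story["requirement"] is None:
        gaps.append("no requirement link")
    if not story["test_cases"]:
        gaps.append("no test case")
    row = (f'{story["id"]}: requirement={story["requirement"] or "-"}, '
           f'test cases={len(story["test_cases"])}, open defects={story["open_defects"]}')
    if gaps:
        row += " [gap: " + ", ".join(gaps) + "]"
    print(row)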

The Story Traceability report in Figure 11 shows a defect count related to the testing of CCM user stories in QM. A more detailed view of the successes and failures in testing is provided by the Test Execution per Requirement report depicted in Figure 12.

Figure 12. Test Execution per Requirement report shows test successes and failures
Bar chart shows blocked, failed, passed tests

For this report, you select a requirements project, a test plan, and the test iteration that you want to focus on. For each requirement that is linked to a user story with test cases, the report shows the number of tests performed and their success status. This report gives you a quick overview of the maturity of specific implemented requirements. It can be used to determine which requirements have problems that must be fixed and which are ready for product owners and external stakeholders to try out in the currently available integration builds.
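
The chart in Figure 12 is essentially a tally of execution results per requirement. Listing 7 sketches that tally with hypothetical execution records; it only illustrates the grouping and is not the report's actual query.

Listing 7. Tallying test execution results per requirement (illustrative sketch)

from collections import Counter, defaultdict

# Hypothetical test execution records for one test plan and iteration.
executions = [
    {"requirement": "REQ-10", "status": "passed"},
    {"requirement": "REQ-10", "status": "failed"},
    {"requirement": "REQ-11", "status": "blocked"},
    {"requirement": "REQ-11", "status": "passed"},
    {"requirement": "REQ-11", "status": "passed"},
]

# Count execution results per requirement (blocked, failed, passed, ...).
per_requirement = defaultdict(Counter)
for execution in executions:
    per_requirement[execution["requirement"]][execution["status"]] += 1

for requirement, statuses in sorted(per_requirement.items()):
    summary = ", ".join(f"{count} {status}" for status, count in sorted(statuses.items()))
    print(f"{requirement}: {summary}")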

Conclusion

This article shows you a subset of the ready-to-use reports available with the Jazz Reporting Service in CLM 5.0. It includes typical usage scenarios: cross-project status reporting and scope management, cross-team dependency analysis, how defects affect team progress, and traceability reporting. Watch the video demonstration and walkthrough of the reports on YouTube.

Additional JRS reports, not covered in this article, are also available. They provide additional views on status (such as open stories for a team in the current iteration) and they cover other CLM domains (such as change set activity or build success). I invite you to install the application, try them out, and provide feedback and requests for additional reports.

