Test Plan Results by Iteration
Value of this report
When evaluating the quality of a software release and its stability across several iterations, you can look at the test case results as they accumulate across the iterations of a test plan. As the iterations progress toward release, the number of failing points in test cases should decrease, or at least stay very low relative to the number of passing test points. Ideally, as you approach the release, the number of blocked or deferred points also drops to near zero. A stacked bar graph lets you represent the test points accumulated for each iteration and spot these trends at a glance.
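To make the trend concrete, here is a small illustrative Python sketch. The record fields and numbers are invented for illustration; they simply mimic the point attributes stored on test case results and show the kind of per-iteration totals a stacked bar would display.

```python
# Hypothetical test case result records; the field names mirror the
# point attributes discussed in the text, but the data is invented.
results = [
    {"iteration": "Sprint 1", "passed": 40, "failed": 12, "blocked": 5, "deferred": 3},
    {"iteration": "Sprint 1", "passed": 35, "failed": 10, "blocked": 4, "deferred": 2},
    {"iteration": "Sprint 2", "passed": 70, "failed": 6, "blocked": 2, "deferred": 1},
    {"iteration": "Sprint 3", "passed": 85, "failed": 2, "blocked": 0, "deferred": 0},
]

def totals_by_iteration(records):
    """Accumulate point counts per iteration, as one stacked bar per iteration would."""
    totals = {}
    for r in records:
        t = totals.setdefault(
            r["iteration"], {"passed": 0, "failed": 0, "blocked": 0, "deferred": 0}
        )
        for key in ("passed", "failed", "blocked", "deferred"):
            t[key] += r[key]
    return totals

for iteration, t in sorted(totals_by_iteration(results).items()):
    print(iteration, t)
```

In a healthy release trend, the failed, blocked, and deferred segments shrink from iteration to iteration while the passed segment grows, which is exactly what the stacked bars make visible.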
At this point, we will construct a test plan results graphic, keying on the test case execution records that are linked to test case result records. The test case execution records are specific to an iteration of a specific test plan. Graphing the test results stored as attributes in the test case result records is a fairly straightforward exercise with the power of the JRS Report Builder. For this example, I am using the report construction workflow in the 6.0.1 M2 milestone build that is available for download from jazz.net.
Test case execution records linked to test results records scoped by a test plan
In the Report Builder interface, first select Build. The first step is to choose a report type; Current Data should be preselected, so just click Continue.
Under Limit scope, you select the Quality Management project area where your test artifacts reside. In this case all related test artifacts reside in the same project area by product design. Click Continue to move on.
Under Choose artifacts, you select Test Case Execution Record artifacts as the primary record to be reported on. Click Continue to move on.
Under Optional: Traceability links, you select All related Test Case Result records and click OK.
Since the link type of Optional is correct, just click Continue to move on.
Under Set conditions, you click Add condition, select Test Case Execution Record, select Test Plan Name, and choose the test plan (or plans) you want in your report. Click Add and Close to dismiss the dialog and move on.
Under Format results, first remove the project area name column, since it is known and implied by the test plan. Click the X to the right of the column and confirm the deletion when prompted. Then add the missing data columns by clicking Add attribute columns.
Next you add the iteration attribute of the Test Case Result records.
Then you add all of the point attributes from the Test Case Result records.
At this point all of the data is present, but you want to sort it by test plan, iteration, and test case execution record name, in that order. Set the Sort Type to Ascending on each of these columns in succession to get the right sort order.
You also want test plan and iteration in the first two columns on the left side of your report so that the sort order cascades properly. Use the up and down arrows in the Action column to arrange the columns. When you have them the way you want, click Continue to move on.
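The cascading sort described above behaves like sorting rows by a tuple of keys, with the leftmost column compared first. A minimal sketch (the row values are invented; the column order mirrors the report layout of test plan, iteration, execution record name):

```python
# Each row: (test plan, iteration, test case execution record name).
rows = [
    ("Plan A", "Sprint 2", "TCER-3"),
    ("Plan A", "Sprint 1", "TCER-2"),
    ("Plan A", "Sprint 1", "TCER-1"),
]

# Sort by test plan, then iteration, then execution record name,
# matching the three Ascending columns set under Format results.
rows.sort(key=lambda r: (r[0], r[1], r[2]))

for row in rows:
    print(row)
```

Because the test plan column is compared first, all rows for one plan stay together, and within a plan the iterations group together, which is why the column order matters for the cascade.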
At this point you need to name your report (e.g., Test Plan Results by Iteration), tag it so it is filed in the right group of reports in the public catalog, and set it to Public to share it. Click Save and Continue to move on.
Review the tabular version of your report to make sure it is selecting all the record data you are interested in. Once you are satisfied the data is correct (and complete), go back and change the default format to a graph.
Click on Format results to move back and click Graph to set up your graphical parameters.
Instead of grouping the results by test plan, we want to use Iteration for the categories. Then combine the numeric columns from the table into a Stacked Bar (set in Graph type), change it to Horizontal (set in Orientation), and exclude Points Attempted and Points Total, since they don't represent distinguishing results. Once these changes are made, click Save to save your changes.
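As a rough illustration of what the horizontal stacked bar conveys, here is a text-mode Python sketch. The per-iteration totals are invented, and Points Attempted and Points Total are excluded just as in the graph settings above.

```python
# Invented per-iteration point totals (attempted/total columns excluded).
data = {
    "Sprint 1": {"passed": 75, "failed": 22, "blocked": 9},
    "Sprint 2": {"passed": 70, "failed": 6, "blocked": 2},
    "Sprint 3": {"passed": 85, "failed": 2, "blocked": 0},
}
glyphs = {"passed": "P", "failed": "F", "blocked": "B"}

def bar(points, scale=5):
    """Render one horizontal stacked bar: one glyph per `scale` points."""
    return "".join(glyphs[k] * round(points[k] / scale) for k in glyphs)

for iteration in sorted(data):
    print(f"{iteration:10s} {bar(data[iteration])}")
```

Each row is one iteration's bar, with the segments stacked end to end, so a shrinking F/B tail from row to row is the release-readiness trend the report is meant to reveal.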
Run your completed report to see how the test results change across the iterations of your test plan.