IBM® Rational® Tester for SOA Quality automates the creation, execution, and analysis of functional and regression tests for service-oriented architecture (SOA) applications. In this article, you will take a look at reporting in IBM Rational Tester for SOA Quality. Specifically, you'll take a detailed look at using both the Test Log viewer and the various options available in a Web service performance report.
The Rational Tester for SOA Quality product is an extension of the Rational Performance Tester application. If you are not familiar with Rational Tester for SOA Quality or Rational Performance Tester, you should take some time to read some of the introductory articles included in the Resources section below.
This article was written using IBM Rational Performance Tester version 7.0.0, IBM Rational Tester for SOA Quality version 7.0.0 Open Beta, Microsoft Windows® 2000 Professional SP2, and the current (as of the initial publication date of this article) Google Web API.
This article will use Web service tests for the Google Web API. You can find a link to the WSDL for this service in the Resources section below. We have three tests in our test suite for this article, one for each operation in the API; the contents of these tests is shown in Figure 1 below.
Figure 1. Test contents for the GoogleAPI test suite
The first test makes a call to the doGetCachedPage() operation. It passes in the API key and the url for IBM developerWorks (http://www-128.ibm.com/developerworks/). For this test case, we have an equal verification point that looks for an exact match for the response message. I have this test case set up to fail its verification point.
The second test makes a call to the doGoogleSearch() operation. It passes in the API key, a q value of "developerWorks," a start value of 0, a maxResults value of 10, and default values for everything else. For this test case, we have a contain verification point that looks only for this snippet on developerWorks: "An online collection of tutorials, sample code, standards, and other resources <br> provided experts at IBM to assist software developers using open standards <b>...</b>". I have this test case set up to pass its verification point.
The third test makes a call to the doSpellingSuggestion() operation. It passes in the API key and the deliberately misspelled phrase "Rationla Performance Tester." For this test case, I again have a contain verification point that looks for the resulting correct spelling. I have this test case set up to pass its verification point.
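The equal and contain verification points described above boil down to two simple matching rules. Here is a minimal Python sketch of that logic; the function names and sample responses are hypothetical, not the tool's actual implementation.

```python
def equal_vp(expected: str, actual: str) -> str:
    """Equal verification point: exact match of the full response message."""
    return "Pass" if actual == expected else "Fail"

def contain_vp(snippet: str, actual: str) -> str:
    """Contain verification point: the snippet must appear anywhere in the response."""
    return "Pass" if snippet in actual else "Fail"

# Hypothetical responses illustrating the three test cases above.
print(equal_vp("<cached>old page</cached>", "<cached>new page</cached>"))       # Fail
print(contain_vp("sample code", "...tutorials, sample code, standards..."))     # Pass
print(contain_vp("Rational", "Suggestion: Rational Performance Tester"))        # Pass
```

The difference matters in practice: an equal verification point breaks whenever any part of the response changes, while a contain verification point only watches the snippet you care about.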
With our three test cases set up, we can now look at some different types of test results.
The Test Log viewer is the best source for the details around a test. It's the first place to go for the results of a functional Web service test, and is the place to go for the details of a performance Web service test. The viewer has two views, Overview and Events. In the Overview view, you can find general information and common properties. General information includes the name, description, and relative file path of the test suite. Common properties include the start and stop times, the verdict for the test run, the type of test suite, and so on. This view also includes verdicts, which indicate the overall result of the test run. Table 1 illustrates the types of verdicts you might see.
Table 1. Possible verdicts and verdict icons
- Error: Indicates that the primary request was not successfully sent to the server, that no response was received from the server, or that the response was incomplete or could not be parsed.
- Fail: Indicates that the verification point did not match the expected response or that the expected response was not received.
- Inconclusive: Indicates that custom code that you provided defined a verdict of inconclusive.
- Pass: Indicates that the verification point matched or received the expected response.
In the Events view, you can find the details for your test run. The Events tree lists all test execution events, such as the script start and end, loop, invocation, message, or verdict. When you select an object in the Events tree, the properties of the selected event or object are displayed under Common Properties and Detailed Properties. Common Properties displays the time and text of a selected event in the Events tree. Clicking the name of the element under Properties opens the test suite, test case, or test behavior of the event that you selected in the Events tree. The Text field displays a message about the execution of the event that you selected in the Events tree. The Defects section is integrated with ClearQuest, allowing you to log or view defects associated with the selected element.
Let's take a closer look at the test suite for the Google Web API. Figure 2 illustrates an example Events tree from an automatically generated schedule.
Figure 2. Events tree for the Google Web API test suite
As you move through the tree (starting at the top and moving down), you can see that there is a verdict roll-up for each level in the tree; the Google API suite element shows the roll-up verdict, and each verification point shows its own verdict. The delay between each Web service call is automatically added; the default value is zero milliseconds. Finally, each call is listed with its corresponding request and verification point.
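The roll-up behaves as you would expect: a parent takes the worst verdict among its children. The severity ordering below (Pass < Inconclusive < Fail < Error) is my assumption, inferred from the verdict descriptions in Table 1 rather than from product documentation.

```python
# Assumed severity ordering: higher number = worse verdict.
SEVERITY = {"Pass": 0, "Inconclusive": 1, "Fail": 2, "Error": 3}

def roll_up(child_verdicts):
    """Roll child verdicts up one level: the most severe child verdict wins."""
    return max(child_verdicts, key=SEVERITY.get, default="Pass")

# One test fails its verification point, so the suite-level verdict is Fail.
print(roll_up(["Fail", "Pass", "Pass"]))  # Fail
```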
For the details of a verification point, you'll want to use the Web Service Protocol view. In this view, you can see the details of the returned envelope and the verification point. For some reason, this view isn't shown by default, so you may need to open it by selecting Window > Show View > Other, and then, in the Show View window, selecting Test > WS Protocol Data. Figure 3 offers a look at one of our verification points.
Figure 3. Web Service Protocol view
You can also see the XML for a request or response Web service call in the Web Service Protocol view. Figure 4 illustrates the XML for the doSpellingSuggestion() response.
Figure 4. Response XML for doSpellingSuggestion()
Web service performance reports are useful both for summary information about functional Web service tests and for detailed information about performance Web service tests. What's great about these reports is that you can customize the summary information that is shown as soon as the test ends.
The first report you see when your test runs is the Overall report; Figure 5 illustrates an example. This report, by default, shows the percentage of Web service calls that were successful, and the pass percentage for each type of verification point you have in your suite.
Figure 5. Overall Web service performance report
There are all sorts of cool things you can do with this report. To see what your options are, right-click anywhere on the report and select Add/Remove Performance Counters > Web Services Performance Counter... This will open the Add/Remove Web Services Performance Counter wizard, illustrated in Figure 6.
Figure 6. Add/Remove Web Services Performance Counter wizard
If you expand any of the counter categories, you'll see different types of counters, such as:
- Percent success
- Average, minimum, maximum, and standard deviation response time or connection time
- Total verification point counts
- Total contains and percent contains for verification points
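Counters like these are simple statistics over the run's verification point tallies. The sketch below shows total contains and percent contains computed from hypothetical results; the real counters are computed by the tool from the test log.

```python
# Hypothetical contain verification point results from one run.
vp_results = ["Pass", "Pass", "Fail"]
total = len(vp_results)
contains = vp_results.count("Pass")
print(f"Total contains: {contains} of {total}")            # Total contains: 2 of 3
print(f"Percent contains: {100 * contains / total:.1f}%")  # Percent contains: 66.7%
```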
Take a look at a specific example. If you expand Response Time, you can add a counter for the average response time for all Web service returns, as shown in Figure 7.
Figure 7. Adding an Average Response Time For All Returns counter
If you add that counter, the Overall report changes, as you can see in Figure 8.
Figure 8. Overall report with the addition of the Average Response Time For All Returns counter
In the example, you can see that the average response time for the Web service returns was 831.33 ms. To make sure that you are seeing the correct counter/value, you can switch over to the Response Time vs. Time Detail report (another tab along the bottom of the Web Service Performance Report view) and take a look at the response time for each Web service call, as shown in Figure 9.
Figure 9. Performance Summary table from the Response Time vs. Time Detail report
If you average those three numbers, you get 831.33. I often try to double-check a counter the first time I add it to a report. That's mostly because I don't trust myself, not because I don't believe the tool. The checkpoint lets me know that I'm doing what I really think I'm doing. I don't like making mistakes in my performance reports. For some reason, management always wants them to be accurate the first time.
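That double-check is just arithmetic. The three per-call times below are hypothetical values chosen only so the average comes out to 831.33 ms; the actual per-call values are the ones shown in Figure 9.

```python
# Hypothetical per-call response times (ms); the real values are in Figure 9.
times = [812.0, 850.0, 832.0]
avg = sum(times) / len(times)
print(round(avg, 2))  # 831.33
```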
You can add and remove counters from all the reports in the same way that you added the Average Response Time For All Returns counter. Open the Add/Remove Web Services Performance Counter wizard and look at the options you have for each report. Play around with adding different counters and see what happens. After you look at the data in a couple of different ways, you'll find out what works for you.
When you close the report you're working with, you'll be asked if you want to save your changes. If you want your changes to become part of the Web Services Performance Report going forward, go ahead and save. If you aren't sure that you always want that information, don't save, but just add it each time you need it (or create a new type of report).
Web Service Verification Point reports cover the details of the verification points in your test. While these reports don't add much in terms of new data, they do present nice summary views of the verification point information. In addition, if you have a large number of verification points in your suite, these reports can be helpful in locating the results quickly.
To show the Web Services Verification Point report, right-click on the log and select Web Services Reports > Web Services Verification Point Report. This will open up the report, starting on the Summary view, as shown in Figure 10.
Figure 10. Summary Web Services Verification Point report
The other views on this report show the details for the various verification point types in the suite. For example, the Return Contain Verification Points view shown in Figure 11 looks at the results for the two contains verification points in the suite.
Figure 11. Return Contain Verification Points view
You can add and remove counters from these reports in the same way as you did above in the Overall Web Service Performance report.
The more time you spend analyzing results, the more interest you'll have in learning some faster ways to find the information you need. The tips and tricks in this section may help you get where you need to go faster.
Often, you'll want to view two reports side by side. You may want to compare them, or you may just want to take in more information at once. To view two reports at a time (actually, you can do this for as many reports as you like), simply click the title of the report, then drag the cursor to the left edge of the viewer area and dock it, as shown in Figure 12.
Figure 12. Dragging the report
The cursor changes to a black arrow. The reports for the two runs are displayed side by side, as shown in Figure 13.
Figure 13. Two reports side by side
By filtering the results that are displayed in a report, you can remove unnecessary data and focus on the data that is significant to you. As far as I can tell, you can filter on any report. Simply right-click on the report (just as you would to add a counter) and select Apply Filter. This will open the Performance Counter Filter dialog, shown in Figure 14.
Figure 14. Performance Counter Filter dialog
Here's a brief summary of the three options you have available:
- Filter by count: Displays the specified number of items. For example, if you select this option and then type 15, the report will show the 15 items with the highest or lowest values (depending on the radio button you select).
- Filter by value: Displays items based on a comparison with the specified value. For example, if you select this option and then type 15, the report will show all of the items that are higher or lower than 15 (depending on the radio button you select).
- Filter by label: Displays items that match the specified label. If you are filtering a table, the label is usually a page, and is listed in the left column. If you are filtering a graph, the label is a legend in the graph.
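The three filter modes reduce to simple operations over (label, value) pairs. This is an illustrative sketch of the filtering rules with made-up counter data, not the dialog's implementation.

```python
# Hypothetical counter items: (label, value) pairs as they might appear in a table.
items = [("Page A", 120), ("Page B", 45), ("Page C", 300), ("Page D", 15)]

def filter_by_count(items, n, highest=True):
    """Keep the n items with the highest (or lowest) values."""
    return sorted(items, key=lambda kv: kv[1], reverse=highest)[:n]

def filter_by_value(items, threshold, higher=True):
    """Keep items whose value is above (or below) the threshold."""
    return [kv for kv in items if (kv[1] > threshold) == higher]

def filter_by_label(items, label):
    """Keep items whose label matches."""
    return [kv for kv in items if kv[0] == label]

print(filter_by_count(items, 2))         # [('Page C', 300), ('Page A', 120)]
print(filter_by_value(items, 100))       # [('Page A', 120), ('Page C', 300)]
print(filter_by_label(items, "Page B"))  # [('Page B', 45)]
```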
Another option, similar to filtering, is to narrow the time range of the test. To recalculate results for particular start and stop times, you can specify a time range for a report. You can enter custom start and stop times to filter out data from the ramp-up or ramp-down phases of test runs. This ability to focus on a specific time range enables you to see, for example, only the results from the period during which the maximum number of virtual users were making Web service calls. Perhaps that's when you really care about the pass/fail percentages of your verification points. The aggregated results are recomputed to take into account only the data collected during the specified time range.
In the performance report you want to change, right-click and select Change Time Range... This opens the Select Time Range dialog, shown in Figure 15.
Figure 15. Select Time Range dialog
Click New Time Range and add the new time range to the list of Available Time Ranges. Click Finish and the report refreshes, zooming the time axis to show data only from the specified time range. Aggregate results are recalculated to reflect only data from the selected time range. Note that the newly specified time range is stored with the report. To return to the complete report, right-click the report, select Change Time Range..., and then select Overall Time Range from the list of Available Time Ranges.
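Conceptually, narrowing the time range just means recomputing aggregates over the samples whose timestamps fall inside the window. A sketch with hypothetical (seconds, milliseconds) samples, where the first and last samples represent ramp-up and ramp-down:

```python
# Hypothetical samples: (timestamp in seconds, response time in ms).
samples = [(0, 900), (30, 850), (60, 820), (90, 810), (120, 950)]

def mean_in_range(samples, start, stop):
    """Recompute the average using only samples inside [start, stop]."""
    window = [ms for t, ms in samples if start <= t <= stop]
    return sum(window) / len(window)

print(round(mean_in_range(samples, 30, 90), 2))  # 826.67 (steady state only)
print(mean_in_range(samples, 0, 120))            # 866.0 (overall)
```

Note how excluding the ramp-up and ramp-down samples lowers the average; that is exactly the noise the time range feature lets you remove.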
You can export test results to CSV, XML, or HTML. This can be useful for a number of reasons. Most often, you'll want to do this to aggregate your test data with data collected from another tool, or for archiving and reporting purposes.
You can export the entire results of a run or specific parts of the results to a CSV file for further analysis. To export results of a run:
- Choose File > Export.
- In the Export window, click Performance test run statistics, and then click Next.
- Type the name of a CSV file (with the .CSV extension), and then click Next.
- Select the run to export, and then click Next. The runs are listed in chronological order, with the most recent run at the bottom of the list.
- At this point, you can select the type of information that you want to export if you wish. Click Finish when you're done.
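Once exported, the CSV can be post-processed with any scripting language. The sketch below uses Python's csv module; the column and counter names are made up, since the actual export layout depends on which counters you selected.

```python
import csv
import io

# Hypothetical export contents; real column names depend on the counters chosen.
exported = """Counter,Value
Percent calls success,100.0
Average response time (ms),831.33
"""

reader = csv.DictReader(io.StringIO(exported))
stats = {row["Counter"]: float(row["Value"]) for row in reader}
print(stats["Average response time (ms)"])  # 831.33
```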
You can export test logs in XML format for further analysis. To export a test log in XML format:
- Choose File > Export.
- In the Export window, click Test execution history, and then click Next.
- In the Export Test Execution History window, browse to the folder where you want to store the log, and then click Next. If you enter only a file name, the log will be exported to the install folder.
- Select the log to export, and then click Finish.
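The exported XML log can then be mined with a standard parser. The element and attribute names below are hypothetical, stand-ins for whatever the actual export schema uses.

```python
import xml.etree.ElementTree as ET

# Hypothetical log structure; real element names depend on the export schema.
log = """<executionHistory>
  <event type="verdict" verdict="Pass" text="Contain VP matched"/>
  <event type="verdict" verdict="Fail" text="Equal VP did not match"/>
</executionHistory>"""

root = ET.fromstring(log)
fails = [e.get("text") for e in root.iter("event") if e.get("verdict") == "Fail"]
print(fails)  # ['Equal VP did not match']
```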
You can export an entire report, or a tab on a report, to HTML format. To export a report to HTML:
- In the Performance Test runs view, right-click on the report or tab to export, and then select Export to HTML.
- In Specify file path for HTML exported file, select a folder to store the newly created report, and then click Next. Although your current project is the default, you would typically create a folder outside of the project to store exported reports.
- Click Finish.
- Optionally, you can paste the exported report into a spreadsheet program for further analysis.
Without going into the details here (because this is already a long article, and others know more about this part of reporting than I do), it's worth noting that you can also include information captured or imported for response time breakdown data and for resource monitoring data. Resource monitoring data consists of a sequence of observations collected at regular intervals. You can collect data in real time, or you can retrieve it from an IBM Tivoli Enterprise™ Monitoring Server. In addition to response time breakdown data, resource monitoring data provides you with a more complete view of a system that can aid in problem determination. Here are some of the kinds of data that you can collect and analyze:
- CPU usage (total, for individual processors, or even for individual processes)
- Available memory
- Disk usage
- TCP/IP and network throughput
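Because resource monitoring data is just a sequence of observations taken at regular intervals, summarizing it is straightforward. Here is a sketch over hypothetical CPU samples; the sampling interval and values are invented for illustration.

```python
# Hypothetical CPU-usage observations (%), one every 5 seconds.
interval_s = 5
cpu = [12.0, 35.0, 88.0, 91.0, 40.0]

peak = max(cpu)
avg = sum(cpu) / len(cpu)
busy = sum(1 for c in cpu if c > 80) * interval_s  # seconds spent above 80% CPU
print(f"avg {avg:.1f}%, peak {peak:.1f}%, >80% for {busy}s of {len(cpu) * interval_s}s")
```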
This feature provides a more complete view of your Web service to help isolate problems. You can monitor the system under test (or the agents) using IBM Tivoli Monitoring agents or Rational Performance Tester. To view resource monitoring data, you can use the Profiling and Logging perspective of Eclipse.
Response time breakdown shows you how much time was spent in each part of the system under test code as the system was exercised. The response time breakdown view is associated with a Web service call from a particular execution of a test or schedule. You can use response time breakdown to do the following:
- Identify code problems
- See which application on which server is a performance bottleneck
- Drill down further to determine exactly which package, class, or method is causing a problem
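Drilling down amounts to aggregating the time spent per package, class, or method and finding the largest contributor. A sketch over hypothetical trace records; the method names are invented.

```python
from collections import defaultdict

# Hypothetical trace records: (fully qualified method, ms spent in that method).
trace = [
    ("com.example.search.QueryService.run", 420.0),
    ("com.example.search.QueryService.parse", 60.0),
    ("com.example.cache.PageStore.get", 351.33),
]

by_method = defaultdict(float)
for method, ms in trace:
    by_method[method] += ms

# The bottleneck is the method where the most time was spent.
bottleneck = max(by_method, key=by_method.get)
print(bottleneck)  # com.example.search.QueryService.run
```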
To capture response time breakdown data, you must enable response time breakdown in a test or schedule, and configure the amount of data to be captured. The data collection infrastructure (something you'll see as you install the Rational Performance Tester tools) collects response time breakdown data. Each host on which the application runs, and from which you want to collect data, must have the data collection infrastructure installed and running. In addition, you must configure (or instrument) each application server to use the data collection infrastructure.
Now that you know how to get into the reports and change them, take some time to play around with the various counters and reports available. Be sure to look at exporting the data to different formats (especially CSV). And for your large tests, practice filtering and changing your time range to see if you can remove some of the noise from your results.
- Visit the Rational Tester for SOA Quality resource page on developerWorks for technical articles, free tutorials, and more.
- "Introduction to IBM Rational Tester for SOA Quality V7.0 and IBM Rational Performance Tester Extension for SOA Quality V7.0" (developerWorks, March 2007): Get an overview of the functionality of these new offerings and see a real example of testing a Web service.
- SOA and Web services: Visit the SOA area on developerWorks to get the resources you need to advance your knowledge and skills.
- Browse the technology bookstore for books on these and other technical topics.
- "Introduction to IBM Rational Performance Tester V7.0," by Michael Kelly (developerWorks, January 2007): Get an introduction to this product.
- developerWorks Rational Performance Tester resource page: See more developerWorks resources on Performance Tester.
Get products and technologies
- Google Web API WSDL: The WSDL for the API used in this article.
- Get a trial download of IBM Rational Tester for SOA Quality V7.0.
- Get a trial download of IBM Rational Performance Tester V7.0.
- Get involved in the developerWorks Rational community by participating in the forums and more.