Contents


Load testing Web applications using IBM Rational Performance Tester

Part 5. Customize, export, and compare reports


Before you start

Learn what to expect from this tutorial and how to get the most out of it.

About this series

IBM® Rational® Performance Tester is a performance testing tool that emulates various user loads to mimic real-life conditions. With proper planning coupled with realistic simulation, this tool uses current loads to estimate future loads. For example, a customer's application may potentially serve a total of 5000 users. With Rational Performance Tester, you can easily emulate user loads of 1000, 2000, 3000, 4000, 5000, and beyond to project user growth, so that you can also project server sizing, such as optimal CPU and memory requirements, more accurately. You can identify and diagnose performance bottlenecks, whether such problems occur in the network, the database, the application server, or even the user application. The root cause analysis capability further analyzes application tiers, which may include page components such as Enterprise JavaBeans™ (EJBs), servlets, the Java™ Database Connectivity (JDBC) API, Web services, and so forth. This functionality enables you to pinpoint the performance culprit easily and efficiently by analyzing the online or extracted reports.

Rational Performance Tester also helps you create, run, and analyze performance tests and validate the scalability and reliability of your Web-based applications before deployment. The default supported protocols, such as HTTP and HTTPS, allow you to run load tests on Web applications. Several extensions are also available:

  • IBM® Rational® Performance Tester Extension for Citrix Presentation Server
  • IBM® Rational® Performance Tester Extension for SOA Quality
  • IBM® Rational® Performance Tester Extension for Siebel Test Automation
  • IBM® Rational® Performance Tester Extension for SAP Solutions

Here's a quick summary of this series of five articles:

  • Part 1 gives you an overview of IBM Rational Performance Tester Version 7.0.
  • Part 2 walks you through the basics of using Rational Performance Tester by creating, running, and evaluating a simple test.
  • Part 3 covers testing as user loads grow (see the next section for more).
  • Part 4 is all about reports, because a load test is only as good as the reports of the results.
  • Part 5 (this part) shows you additional reports, as well as how you can customize and export the reports to suit your needs.

The goal of this series is to help you understand the features, topological considerations, and constraints so that you can create and test Web applications and analyze the performance reports. With this knowledge and the ease of use of Rational Performance Tester, load testing a Web application will no longer be a burdensome chore, and you can include it for each iteration of your software.

About this tutorial

This tutorial assumes that you have completed Parts 1-4 of this series. This tutorial explores the various other performance analysis reports provided by Rational Performance Tester. It also highlights some features related to using these analysis reports, such as navigation and customization of the reports.

Prerequisites

Be sure to work through Parts 1 through 4 before you start this article, because you use the same sample applications. It's important that you have learned the basics of using Rational Performance Tester for load testing from the other articles in this series, so that you can proceed to the more complex activities in this one.

Note:
The workbench machine should be used only for workbench activity, such as creating tests and distributing the performance load to run on remote machines.

Please ensure that your system meets these prerequisites:

Table 1. Required resources

Resource | Workbench machine | Remote machines
Hardware | Minimum 1 GB of memory; more if also running tests | Minimum 1 GB of memory
Software | IBM Rational Performance Tester (includes IBM Rational Agent Controller); IBM Rational License Server | IBM Rational Agent Controller
Licenses | Activation kit for Rational Performance Tester to enable permanent use; floating license key imported into Rational License Server (the key must cover at least the number of virtual users that will run in Rational Performance Tester*) | Points to the floating license key served by the workbench machine
Network | Able to ping all remote machines | Able to ping the workbench machine

*The trial version of Rational Performance Tester allows only five concurrent virtual users. To test with more than that, you need to purchase a license. The IBM® Rational® Software Delivery Platform V7.0 - Desktop Product Activation site has information about how to get licenses and the activation process. You can download both IBM® Rational® Agent Controller and IBM® Rational® License Server from the IBM Software Access Catalog. See Related topics for links.

IBM Rational License Server manages floating and named-user license keys for Rational products. The floating license key is required if you want to run more than five virtual user tests. In this example, the license key is imported into the license server, which resides on the workbench machine and serves the key to all remote machines. The remote machines point to the license server.

The IBM Rational Agent Controller needs to be installed on all remote machines to enable distributed testing. The workbench machine already has Rational Agent Controller, because it is installed along with Rational Performance Tester.

Figure 1 shows the setup that you need for the exercises in this article.

Figure 1. Topology of the setup for remote testing
Topology of the setup for remote testing

Performance reporting in Rational Performance Tester

The saying "a picture is worth a thousand words" applies to the various analysis reports that come with IBM® Rational® Performance Tester. These reports, complete with easy-to-use features, not only enhance your visual experience of the test results, but also enable you to identify application bottlenecks rather easily. The purpose of these reports is to enable you to run as many performance tests as possible during the test phase, with as little burden as possible incurred from using the tool.

Part 4 of this series discussed the following:

  • The performance testing process
  • An introduction to analysis reports and counters
  • How to navigate to reports
  • Report term definitions
  • Detailed information about the Performance and Page Element reports

Overview

The emphasis of this tutorial is to showcase the feature-rich reporting capabilities built into Rational Performance Tester. The examples use the standard test application DayTrade as the application under test. Some of the reports captured here are purely for illustrative purposes.

Part 4 of this series discussed the Performance and Page Element reports in detail. This tutorial discusses the following report types, which are included in Rational Performance Tester:

  • Percentile report
  • Verification report
  • Transaction report

Citrix and SAP reports are outside the scope of this tutorial, and therefore will not be discussed.

This tutorial will first drill down to the types of reports that are readily available in Rational Performance Tester. Subsequently, it will look at the available features surrounding the reporting capabilities, such as navigation, customization, and report export.

The reporting features can be divided into the following categories:

  • Remote Analysis. This feature allows you to examine the analysis reports offline. You can import the entire project to run offline (for example, on your laptop when you travel). You do not need to have access to the test center to be able to analyze reports.
  • Report Comparison. You can compare performance analysis reports from different runs. Each run (with a different load) yields different results that can be compared for trend analysis. An easy way to do a quick report comparison is provided online, because each run is automatically saved with a timestamp. To compare reports, you can do so side by side online, or export the reports (see the following bullets).
  • Report Customization. This capability allows you to customize the reports that are of most concern to you. Though Rational Performance Tester provides several reports, you may only need specific ones in order to pinpoint a performance bottleneck. In this case, customizing reports gives you exactly the information that you want.
  • Exporting Reports. You can export performance analysis reports to HTML, CSV, and XML formats. You might do this to present to third-party or higher management stakeholders (because the formats are portable), or to compare reports.

The remainder of this tutorial explores each topic in more detail.

Types of reports

Rational Performance Tester provides application-level and resource-level analysis reports. Application analysis reports cover application performance issues, such as page or page element response time, page hits, and page or page element throughput. Resource analysis reports, on the other hand, cover system resource utilization, such as CPU, memory, disk, and network throughput. Currently, resource monitoring data can be collected from three sources:

  • IBM® Tivoli® Monitoring (ITM)
  • Microsoft® Windows® Performance Monitor (PerfMon)
  • Linux®/UNIX® rstatd

In Rational Performance Tester, the performance test reports provided are HTTP, SAP, and Citrix; the latter two require extensions (SAP and Citrix reports are not discussed in this tutorial). HTTP performance reports, which are discussed in greater detail in the following sections, fall into the categories summarized in Table 2.

Table 2. Summary of additional HTTP analysis reports in Rational Performance Tester

Report Category | Report Tab Name | Description
Percentile Report | Summary | Bar graphs of response time versus page counter, showing the response time distribution for all pages at three percentiles: 85th, 90th, and 95th.
Verification Report | Summary | A line graph summarizing the percentage of page verification points passed for all pages, with each point representing an interval.
Verification Report | Page Verification Points | A table showing pass, fail, and percent pass for page verification points on a per-page basis.
Verification Report | Page Element Verification Points | A table showing the page element verification point pass count, fail count, and percent pass.
Transaction Report | Overall Transaction | The average execution time for all transactions, with each point representing an interval.
Transaction Report | Duration vs. Time | A line graph showing the details of each page or element that contributed to the transaction.
Transaction Report | Transaction Throughput | One graph showing the transaction start and completion rates per second, and a second graph showing the user load as the test runs.

Additional analysis reports

This section explains the performance reports (although many of them are self-explanatory) by focusing on the content of each report. To pinpoint performance bottlenecks, analyzing the right reports is essential. You will view the default report presentations in this section; customization of these reports is discussed in a later section.

Percentile Report

The Percentile Report - Summary graphs show response time versus page counter, giving the response time distribution for all pages at three percentiles: 85th, 90th, and 95th. You can apply filters to the graph. The 85th percentile bar indicates that 85% of all users achieved the indicated response time or better; the 90th and 95th percentile bars read the same way for 90% and 95% of all users, respectively. The following bullets describe the Performance Summary table shown in Figure 1:

  • The first left column displays the page name.
  • Response Time - Minimum for Run. This is the fastest response time for that page.
  • Response Time - Average for Run. This is the average of all response times recorded for that page during the run.
  • Response Time - Maximum for Run. This is the slowest response time for that page.
  • Response Time - Standard Deviation for Run. This is the deviation from the mean. A bigger deviation number means there was less consistency in the response times, and a smaller deviation number means more consistency.
  • Response Time - 85th Percentile. 85% of that page response time was equal or faster than the time shown.
  • Response Time - 90th Percentile. 90% of that page response time was equal or faster than the time shown.
  • Response Time - 95th Percentile. 95% of that page response time was equal or faster than the time shown.
  • Attempts - Rate for Run. The number of times per second that the page was requested from the server.
Figure 1. Percentile Report: Summary
bar graph above data table
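The summary statistics in this table can be reproduced directly from raw response-time samples. The following sketch uses made-up sample data (not actual DayTrade output) and the common nearest-rank definition of a percentile, which may differ slightly from the interpolation Rational Performance Tester uses internally:

```python
import math
import statistics

def percentile(samples, pct):
    """Nearest-rank percentile: the smallest sample such that at
    least pct percent of all samples are equal to it or faster."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Hypothetical page response times in milliseconds for one run.
times = [120, 135, 150, 160, 180, 210, 240, 300, 450, 900]

summary = {
    "Minimum for Run": min(times),
    "Average for Run": statistics.mean(times),
    "Maximum for Run": max(times),
    "Standard Deviation for Run": statistics.stdev(times),
    "85th Percentile": percentile(times, 85),
    "90th Percentile": percentile(times, 90),
    "95th Percentile": percentile(times, 95),
}
```

Note how the single slow outlier (900 ms) inflates the maximum and the standard deviation while leaving the 85th percentile relatively stable; this is exactly why percentile columns are often more useful than the maximum when judging user experience.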

Verification Point Report

Verification points ensure that the expected result can be delivered for a particular page or page element. Unexpected returns during a test run will be reported as a failed verification point. There are four types of verification points:

  • Page title. Page title is a case-sensitive comparison that ignores spaces between words. This verification works for the primary page.
    • To enable the page title verification point, highlight the primary page and select the Enable Verification Point box at the bottom right corner. Alternatively, right-click the primary page to select Verification Points > Enable Page Title VPs.
    • To turn on page title verification points for all pages, select and enable them from the highest hierarchy in the test contents.
  • Content. Content verification applies string matching to ensure that an expected string is found. You define the passing criterion as either at least one of the search strings being found or none of the search strings being found; a verification point fails when its criterion is not met. You can enable content verification points for all of the elements by selecting a test element in a higher hierarchy, or simply select an individual element to be enabled. In other words, enabling at the test, page, and page element level is supported.
  • Response Code. Response code verification point sets the passing criteria using the returned response code (set at the test, page, or page element level). You can set the response code matching method to either Relaxed or Exact. A relaxed match is a match that gives a pass for response codes that fall in the same category, while an exact match will give a fail if the response code does not exactly match the specified code.
  • Size. A response size verification point fails when the returned size does not satisfy the selected matching method. Again, you can enable the verification point at the test, primary page, and page element level. For the primary page, the default matching method is range (bytes and %), while page elements use the exact matching method as the default. Other matching methods include At Least and At Most.
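To make the four pass/fail criteria concrete, here is an illustrative Python sketch of each check. This is not Rational Performance Tester's implementation; the function names, and details such as how whitespace is normalized in the title comparison, are assumptions made for illustration only:

```python
def check_page_title(expected, actual):
    """Case-sensitive comparison that ignores extra whitespace
    between words (one reading of 'ignores spaces between words')."""
    normalize = lambda s: " ".join(s.split())
    return normalize(expected) == normalize(actual)

def check_content(body, search_strings, pass_if_found=True):
    """Pass if at least one search string is found -- or, when the
    criterion is inverted, if none of the strings are found."""
    found = any(s in body for s in search_strings)
    return found if pass_if_found else not found

def check_response_code(expected, actual, relaxed=True):
    """Relaxed: same category (2xx, 3xx, ...). Exact: identical code."""
    if relaxed:
        return expected // 100 == actual // 100
    return expected == actual

def check_size(expected, actual, range_pct=0.0):
    """Range match within +/- range_pct of the expected size; a range
    of zero degenerates to the exact match that page elements use
    by default."""
    return abs(actual - expected) <= expected * range_pct / 100
```

For example, a relaxed response-code check passes a 204 against an expected 200 (both are in the 2xx category), while an exact check fails it.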

The Verification Point - Summary report, shown in Figure 2, gives you a summary of the aforementioned verification points. It is a line graph representing the summary of percent page verification points passed for all pages, with each point referring to an interval. The table gives information such as the sum total page verification points attempted, passed, and failed.

Figure 2. Verification Point Report: Summary
line graph above data table

The Page Verification Points report, shown in Figure 3, is a table showing information for page verification points' pass, fail, and percent pass on a per-page basis. Note that only primary pages that are enabled with verification points will be shown here.

Figure 3. Verification Point Report: Page Verification Points
data table

The Page Element Verification Points report, shown in Figure 4, displays the pass count, fail count, and percent pass.

Figure 4. Verification Point Report: Page Element Verification Points
table showing pass count, fail count, and % pass

Transaction Report

A transaction is a collection of elements (both primary pages and page elements) that can be gathered for better performance analysis. To select the elements that constitute a transaction, select multiple elements in the Test Contents panel, as shown in Figure 5. The transaction elements may not be (and often are not) sequential in the order listed in the Test Contents panel. In other words, you can pick any element to make up a transaction. Usually, you gather a few test elements whose performance you are interested in to make up a transaction.

Figure 5. Adding a transaction
pop-up menu command

The Transaction Report consists of three default reports: Overall transaction rate, Transaction Duration vs. Time, and Transaction Throughput.

The Overall Transaction report, shown in Figure 6, presents the average execution time for all transactions, with each point representing an interval. The table shows the execution time standard deviation, minimum time, and maximum time for all transactions.

Figure 6. Transaction Report: Overall
line graph above data table

The Duration vs. Time Transaction Report, shown in Figure 7, may consist of an individual page or page element within a test. This report shows the average execution time, with each point representing an interval. Each transaction is shown as an individual line in the graph. The table provides the following information related to each transaction:

  • Minimum Execution Time. This is the minimum execution time for the run.
  • Maximum Execution Time. This is the maximum execution time for the run.
  • Average Execution Time. This is the average execution time for the run.
  • Execution Time Standard Deviation. This is the deviation from the mean for the run.
  • Rate of Completion. This is the transaction completion rate, measured per second.
  • Total Attempts. This is the total attempts made by the transaction for the run.
Figure 7. Transaction Report: Duration vs. Time
overlapping line graphs above data table

The Transaction Throughput report, shown in Figure 8, includes two line graphs and two tables. The left graph shows the transaction start and completion rates, measured per second, with each point representing an interval. The left table provides the transaction completion rate per second, as well as the completion count for the run. The right graph shows the user load being added (active users) and the number of users who completed the run (completed users). The table on the right gives the counts for active, completed, and total users.

Figure 8. Transaction Report: Transaction Throughput
two graphs side by side
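The two rates in the left graph can be derived from per-transaction start and completion timestamps. The sketch below, using hypothetical timestamps rather than real test output, buckets both events into fixed-length intervals, which is essentially what each point on the throughput graph represents:

```python
from collections import Counter

def throughput(transactions, interval=5):
    """Bucket transaction start and completion timestamps into
    fixed-length intervals (in seconds) and count each per bucket,
    mirroring the two rates plotted in the throughput graph."""
    starts = Counter(int(s // interval) for s, _ in transactions)
    completes = Counter(int(e // interval) for _, e in transactions)
    last = max(int(e // interval) for _, e in transactions)
    return [(b * interval, starts[b], completes[b]) for b in range(last + 1)]

# Hypothetical (start_time, end_time) pairs in seconds.
txns = [(0.5, 2.0), (1.0, 6.5), (4.9, 5.1), (7.0, 12.0)]
rates = throughput(txns)
# Each row: (interval start, transactions started, transactions completed)
```

A widening gap between the start counts and the completion counts across consecutive intervals is a quick sign that transactions are taking longer to finish as the load ramps up.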

How to customize reports

By now, you have seen the default reports in various categories, all of which are available to help you quickly diagnose potential performance bottlenecks. With Rational Performance Tester, you can further customize the reports for a specific view, compare the reports online, or export them in CSV, HTML, or XML formats. Note that all customizations are based on the generic performance counters provided by Rational Performance Tester, because the counters are the basic building blocks for performance reports.

You can customize existing reports by changing the look and feel, adding more performance counters, extending the time range, and so on. You can also customize (create) a report from scratch using the Manage Reports option from the Performance Test Runs view. This tutorial will first show you how to customize a default report, and then how to create a report.

Customize an existing report

Using the Performance Report (the Page Performance tab) as an example, you will modify its look and feel. The following steps show you some of the customization features available.

  1. Right-click the Page Performance report and select the Customize option, as shown in Figure 9.
    Figure 9. Default Performance Report customization: Step 1
    bar graph above data table
  2. Change the Y-axis label to Average Response Time in ms, select Use 3D Bars, and change the pixel Height to 500, as shown in Figure 10. After you click OK, the report will reflect the changes accordingly.
    Figure 10. Default Performance Report customization: Step 2
    Graphic Configuration dialog options
  3. Because the customization option is context sensitive, you can hover over the table below the graph, and then right-click.
  4. Next, select from options such as Sort Labels, Summary Table, Sort Columns, Invert Table Rows/Columns, and so on, as shown in Figure 11.
    Figure 11. Default Performance Report customization: Step 3
    options dialog on bar graph
  5. You can also add a performance counter to the existing report. However, the counters you add should make logical sense, and their values should not fall outside the chart's range. In this scenario, right-click and select Add/Remove Performance Counters > Page Performance Counter.
  6. Select the counter Maximum [for Run] under Response Time [ms] > All Pages, as shown in Figure 12.
    Figure 12. Default Performance Report customization: Step 4
    tree view of available counters
  7. After you click Finish, you should be presented with a new graph that includes maximum response time for the run, as shown in Figure 13. You can add as many counters as you need, so long as they are logically grouped and do not obscure the information presented in the report.
    Figure 13. Default Performance Report Customization: Step 5
    bar graph above data table

Create a new report

The following steps show you how to create a report from scratch.

  1. From the Performance Test Runs view, right-click and select Manage Reports, as shown in Figure 14.
    Figure 14. Defining a new report: Step 1
    menu command
  2. Click Create in the Select Report window.
  3. Enter a name for the new report that you want to create (in this example, DayTrade Verification), as shown in Figure 15. Click the Insert button to insert at least one tab. Note that a report can contain as many tabs as needed. For this exercise, you will create only one tab, containing two graphs.
    Figure 15. Defining a new report: Step 2
    name report and insert report tab
  4. Enter a name for the new tab (for example, Page Verification for DayTrade).
  5. Select Custom Tab (2 graphics), and select the available boxes, as shown in Figure 16. Click Next to continue.
    Figure 16. Defining a new report: Step 3
    specify tab properties
  6. Give the first graphic a title (for example, DayTrade - Percent Page VPs Passed for Interval). Choose Line Chart as the graphic type, as shown in Figure 17. At this point, you could customize (by clicking the Customize button) the X and Y axis, or add a filter (by clicking the Add button) to the graphics.
  7. Click Next to continue to the performance counter selection.
    Figure 17. Defining a new report: Step 4
    select graphic type and filters
  8. Because this tab is to show the verification points for all pages, choose Generic Counters > Pages > Verification Points > Percent Page VPs Passed [for Interval], as shown in Figure 18.
  9. Click the Add button to include the counter that you selected.
  10. Click Next to proceed to the second graph.
    Figure 18. Defining a new report: Step 5
    create data sets and filters
  11. Repeat the process for the second graph, but use the following options instead:
    • Title: DayTrade - Total Page VPs Attempted for Interval
    • Graphic Type: Line chart
    • Generic Counter: Total Page VPs Attempted [for Interval]
  12. You can choose to include more tabs, with each tab showing different performance counters. In this case, it suffices to create one tab, as shown in Figure 19. Click Finish to complete the report.
    Figure 19. Defining a new report: Step 6
    name report and insert report tab
  13. To display the report you just created, right-click and select Display Report. Choose to display the DayTrade Verification report, as shown in Figure 20.
    Figure 20. Defining a new report: Step 7
    two line graphs side by side

How to export and compare reports

Export performance test run statistics

All performance-related reports can be viewed online, but there is also an option for you to export performance test run statistics to a file. Individual performance reports can be exported to CSV, HTML, and XML formats. The following steps show you how to export performance test run statistics to an HTML file.

  1. Right-click in the Test Navigator view and select Export, as shown in Figure 21. Alternatively, select File > Export from the menu.
    Figure 21. Exporting Performance Test Run Statistics: Step 1
    menu command
  2. Note that the Export wizard is used for other purposes as well, in addition to exporting performance related statistics for further analysis. For example, in the absence of a source control system, you can export a performance test project from a workspace to an archive file to be shared with others. In this scenario, you will confine your export to performance related statistics after the test run, as shown in Figure 22.
    Figure 22. Exporting Performance Test Run Statistics: Step 2
    select an export destination
  3. Browse to the current project, and create a folder (for example, ExportedStats) and name the new file DTStats, as shown in Figure 23.
    Figure 23. Exporting Performance Test Run Statistics: Step 3
    select the CSV file
  4. Pick a test run to export, as shown in Figure 24. You can pick as many test runs to export as you like, but in this case just pick one test run to export.
    Figure 24. Exporting Performance Test Run Statistics: Step 4
    select the performance test run
  5. Statistics that are available for export can be divided into three main categories, as shown in Figure 25:
    • Pages
    • Page Elements
    • Miscellaneous

You can choose to export overall results, and the time since the start of the run. Once you click the Finish button, the reports will be generated in the directory that you specified previously.

Figure 25. Exporting Performance Test Run Statistics: Step 5
select statistics to include in export

Export response time breakdown statistics

Other than the usual performance statistics export, you can also quickly export response time breakdown statistics for a test run.

  1. From the Page Performance tab of the Performance Report, navigate to the response time breakdown, right-click and select Report.
  2. Choose the option (for example, Report in HTML format, as shown in Figure 26).
    Figure 26. Response Time Breakdown Statistics Export: Step 1
    choose CSV, HTML, or XML format
  3. Specify a directory and filename (for example, ExportedStats and DTResponseTime, respectively). Once you click the Finish button, an HTML file containing the response time breakdown will be generated, as shown in Figure 27. You can generate the same statistics in CSV and XML format.

Compare reports

The simplest way to compare performance statistics side by side is to use the CSV format, because it allows results from many test runs to be compared at once.
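As a sketch of such a comparison, the following snippet joins the exported statistics of two runs by page name using only the standard library. The column headers and CSV contents here are hypothetical; the actual headers in an exported file depend on which statistics you selected during the export:

```python
import csv
import io

def load_stats(csv_text, key="Page", metric="Average Response Time [ms]"):
    """Read exported statistics CSV text into a {page: metric value} map."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row[key]: float(row[metric]) for row in reader}

def compare(baseline, candidate):
    """Per-page delta between two runs; a positive value means the
    candidate run was slower on that page."""
    return {page: candidate[page] - t
            for page, t in baseline.items() if page in candidate}

# Hypothetical exports from two runs at different user loads.
run_1000_users = "Page,Average Response Time [ms]\nLogin,120\nQuote,300\n"
run_2000_users = "Page,Average Response Time [ms]\nLogin,150\nQuote,280\n"

deltas = compare(load_stats(run_1000_users), load_stats(run_2000_users))
```

Reading the exports programmatically like this scales to many runs, which is exactly the trend analysis that side-by-side CSV comparison is meant to support.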

The advantage of exporting statistics to XML format is that, by applying XSLT, you can further customize reports for better graphical presentation. Applying XSLT also lets you store reports in a non-editable format (such as PDF).

Figure 27. Response Time Breakdown Statistics Export: Step 2
exported report

Conclusion

This latest part of the series looked at various capabilities provided by IBM Rational Performance Tester to generate default performance analysis reports. It also showed you how you can customize the reports to suit your needs.


Downloadable resources


Related topics



Published on IBM developerWorks, 21 April 2009 (article ID 382805).