Use Rational Performance Tester to detect runtime parameterization issues
Apply Java-based custom code to simulate complex workloads
To adequately test performance, testers must be able to simulate large volume loads of two types:
- A large number of concurrent or active users (for example, 5,000)
- High page throughput (for example, 100 pages per second)
For large volume tests, it is difficult to prepare test data and to feed it to the automation tool at runtime. Performance test automation tools encounter two common problems related to runtime parameterization, or data pooling:
- Tools cannot accurately log test data information for specific kinds of failures. Without customization, the tools cannot detect invalid test data or the source of the failure.
- Tools cannot easily read large volumes of user data from files. High throughput can create load generation problems because it results in excessive I/O on the load generator's operating system. An Excel spreadsheet that contains an excessive number of data entries (>65500) cannot be used to load test data at runtime.
IBM® Rational® Performance Tester includes the custom code feature, which addresses these problems. You are probably familiar with using custom code to customize test scripts, control playback behavior, and adjust reporting mechanisms. Because Java™-based custom code acts as a standalone Java program, you can use it to address almost any customization requirement.
This article explains how to extend the capabilities of Rational Performance Tester to customize logs that report test data failures and how to use custom code to enable runtime parameterization.
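Before getting into the scenarios, it helps to know what a piece of Rational Performance Tester custom code looks like. The sketch below shows the general shape of the class that the tool generates when you insert custom code; the class name here is a placeholder, and the body is intentionally empty:

package customcode;

import com.ibm.rational.test.lt.kernel.services.ITestExecutionServices;

/**
 * Minimal sketch of a Rational Performance Tester custom code class.
 * MyCustomCode is a placeholder name; the tool generates a similar
 * template when you click Generate code.
 */
public class MyCustomCode implements com.ibm.rational.test.lt.kernel.custom.ICustomCode2 {

    public MyCustomCode() {
        // the generated template includes an empty constructor
    }

    // args holds the values passed in from the test script (data pool values,
    // variables, references); the return value can be substituted into the test
    public String exec(ITestExecutionServices tes, String[] args) {
        return null;
    }
}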
Detect and log invalid or erroneous test data
This article is based on the simple scenario shown in Figure 1, in which a user logs in to an application, visits his or her profile page, and logs out. During the test, the tester substitutes test values for username at runtime (for test purposes, the password is the same for all users). The tester must find an effective means of creating custom logs at runtime. The logs must include the total number of username values used during testing and the details for the usernames that are able to log in successfully.
Figure 1. Test case scenario

Use custom code to pass arguments
A custom log file created at runtime needs to include each username value and information about the condition that triggered its log-in success or failure.
The first step is to add custom code that logs the username value into a file. This action is performed in the test script, which is shown in Figure 2.
Figure 2. Test script

The test data for Home Page includes the details for username and password, as shown in Figure 3.
Figure 3. Test data for Home Page

To create a custom log file, follow these steps:
- For username, specify a data pool for the username variable, in this case User_datapool, as shown in Figure 4.
Figure 4. Data pool name in the "Substituted with" column

- To capture the username value with which the login transaction is being performed, insert custom code and pass the data pool value to that code as an argument, using the following steps (as shown in Figure 5):
- Select the secondary request that is below the primary request in bold.
- Click Insert.
- Select Custom Code.
Figure 5. Insert custom code to capture username value

- Pass arguments to the custom code using the following steps, as shown in Figure 6:
- For Class name, type customcode.Username_Logger.
- Click Add.
- Select the Username variable of the User_Datapool data pool.
- Click Select to pass the argument.
Figure 6. Pass arguments to custom code

- Click Generate code to generate the default custom code template, as shown in Figure 7.
Figure 7. Click Generate code

- Using the following steps, decide which username values to capture: those that logged in successfully, or those that did not. Create an IF condition to enable the custom code to capture the username values you want to record. For example, if the keyword Welcome denotes the successful login of a user, add this keyword as the content to be searched for in the response of the login request. Username values that match this condition are captured.
- Select the secondary request that appears below the primary URL, as shown in Figure 8.
- Click Insert.
- Select Condition (If). Click Yes if prompted.
Figure 8. If condition addition to evaluate successful transactions

- The IF condition ensures that the custom code logs only those username values that satisfy it. To define the condition, compare two operands: the Welcome keyword and the response to the login request, which is searched for occurrences of that keyword. Use the following steps to create a reference and add it to the IF condition:
- Select the response of the primary request that is shown in bold.
- Right-click the response content, as shown in Figure 9.
- Select Create Field Reference, which creates a reference for the entire response content.
Figure 9. Create Field Reference
- Name the reference Content: Login_Response. Click OK, as shown in Figure 10.
Figure 10. Provide name for field reference
- Define the log file name and path as a global variable by following these steps, as shown in Figure 11:
- Select the Test Variables entity in the test script.
- Click Add.
- Select Variable Declaration.
Figure 11. Add File name and path as a global variable
- Type FileName for the Variable Name.
Figure 12. Variable name
- Initialize the value of this variable to the file name of your log, as shown in Figure 13. In this case, the file name is C:\\Valid_Data_Log.txt. (Note that there are two backslashes, to escape the special character \. When you specify a file name on UNIX operating systems, you do not need the extra backslash, because the file separator on UNIX operating systems is a forward slash (/).)
Figure 13. Initialize the variable

- Go back to the custom code in the script and add this variable as a second argument, as shown in Figure 14.
Figure 14. Add filename as argument to custom code

- Insert the code in Listing 1 in the file Username_Logger.java.
Listing 1. Custom code for logging username values
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

BufferedWriter out = null;
FileWriter fstream;
try {
    fstream = new FileWriter(args[1], true); // true signifies data append instead of overwrite
    out = new BufferedWriter(fstream);
    out.write(System.getProperty("line.separator")); // inserts a new line in the file
    out.write(args[0]); // writes the username value into the file
} catch (IOException e) {
    e.printStackTrace();
} finally {
    if (out != null) {
        try {
            out.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Now the script is ready to log all of the valid username values to the log file Valid_Data_Log.txt in the C:\ directory.
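For context, the following sketch shows how the body of Listing 1 might sit inside the template generated in Figure 7. The package and class names match the Class name entered in Figure 6 (customcode.Username_Logger); the rest is an illustration based on the standard template rather than the exact generated file:

package customcode;

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

import com.ibm.rational.test.lt.kernel.services.ITestExecutionServices;

public class Username_Logger implements com.ibm.rational.test.lt.kernel.custom.ICustomCode2 {

    public String exec(ITestExecutionServices tes, String[] args) {
        // args[0] = the username value from User_Datapool (first argument, Figure 6)
        // args[1] = the log file path from the FileName variable (second argument, Figure 14)
        BufferedWriter out = null;
        try {
            // true opens the file in append mode instead of overwriting it
            FileWriter fstream = new FileWriter(args[1], true);
            out = new BufferedWriter(fstream);
            out.write(System.getProperty("line.separator")); // start a new line
            out.write(args[0]);                              // write the username value
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (out != null) {
                try {
                    out.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
        return null; // nothing is substituted back into the test
    }
}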
Extensions of this method
You can use this method for other purposes. For example, you can log the invalid or erroneous username values by selecting the option Negate the operator (NOT (op)) in the IF condition. By changing the IF condition, you can control the data that gets logged.
You can also customize this method to validate other test data and to test for other conditions, according to your particular requirements.
Custom code to enable runtime parameterization
You can use Java custom code to pass data pool values at runtime from an external file. This method solves many existing problems:
- An uneven split of the data pool across agents, and difficulty sharing the same data across agents (in shared and segmented modes).
- Inability to customize the load simulation by controlling which data resides on which agent machines.
- Issues with how Excel spreadsheets handle large volumes of data.
- Excessive I/O from high-volume loads through traditional methods of data pooling.
With custom code, you substitute the value of the username used in the previous scenario with values from an external file, rather than using the traditional data pool feature included in Rational Performance Tester.
Use the code in Listing 2 to implement this method. This custom code reads the values from a file. You can customize this code to suit your requirements.
Listing 2. Custom code to manually carry out runtime parameterization
import java.io.File;
import java.io.FileNotFoundException;
import java.util.ArrayList;
import java.util.Scanner;

int total_users_per_agent = <<xxxx>>;
String datapool_var = null;
int current_user = Integer.parseInt(args[0]); // args[0] is the Username data pool value passed to this custom code
try {
    ArrayList<String> datapool_array = new ArrayList<String>();
    Scanner s = new Scanner(new File("C:\\Datapool.csv"));
    while (s.hasNext()) {
        datapool_array.add(s.next());
    }
    s.close();
    // apply the mod function to map the user number to a row index
    int user_cnt = current_user - (total_users_per_agent * (current_user / total_users_per_agent));
    String[] desired_column_value_datapool = datapool_array.get(user_cnt).split(",");
    datapool_var = desired_column_value_datapool[<<index_of_column>>];
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
return datapool_var; // datapool_var is the desired row-column value of the data pool
The variable total_users_per_agent denotes the number of virtual users running the test script that contains this custom code on an agent machine.
Note: In this context, the agent machine is a physical machine, not a single instance (JVM) of the Rational Performance Tester load driver agent.
The variables total_users_per_agent and current_user (derived from the username data pool) are used to compute the row index in the data pool file. The rows of the file are stored in the datapool_array string list, and the Java split function splits the contents of the selected row on the comma separator to derive the individual values. This custom code requires the data pool to be in CSV format.
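To make the row-index arithmetic concrete, here is a small standalone example with illustrative values (1,000 virtual users per agent, user number 2,345); the subtraction expression in Listing 2 is simply a modulo operation written out by hand:

public class RowIndexExample {
    public static void main(String[] args) {
        int total_users_per_agent = 1000; // illustrative value
        int current_user = 2345;          // illustrative value from the Username data pool
        // Integer division truncates: 2345 / 1000 == 2
        int user_cnt = current_user
                - (total_users_per_agent * (current_user / total_users_per_agent));
        // Prints 345, the same result as current_user % total_users_per_agent
        System.out.println(user_cnt);
    }
}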
The step-by-step method described earlier for creating custom code and passing arguments can also be used to implement the custom code that enables runtime parameterization.
Issues to be aware of
Use the custom code logging method only when the amount of data written to files is small. Do not use this method to log the entire responses of requests, because doing so results in heavy I/O operations and causes performance degradation from high CPU waits, disk I/O latency, and a long queue of I/O processes.
Performance degradation caused by these actions, or by related issues such as the larger memory footprint incurred by the use of many large custom code modules, degrades load generation capability. High CPU usage on the load generation hardware and the poor response times reported by the performance test tool are directly related: the tool can effectively hang under high CPU usage, and because it reports response times only after adding its own latency in receiving the response, the performance measurement is skewed.
Monitor the utilization of your load generation hardware, particularly the processor utilization of both the workbench and agent machines, while running performance tests. It is also helpful to monitor the memory used by the Java process while simulating a high volume of concurrent users. The utilization of TCP/IP sockets likewise affects load generation performance when a high volume of requests is generated against the server. Finally, monitor disk I/O utilization and check for disk latencies that result from excessive logging or from the logging levels configured in the test schedule.
Ensure that utilization does not exceed a threshold appropriate to your tests. Different kinds of application testing demand different thresholds for load generation hardware utilization, but in most situations, do not allow CPU, memory, disk, network, or TCP/IP socket utilization in the load generation environment to exceed 70%.
Aim to build your custom code so that the performance of the automation tool during load generation stays within an acceptable limit.
Related topics
- Find out more on the Rational Performance Tester product overview page. Then explore the Rational Performance Tester page on IBM developerWorks for links to technical articles and browse the user assistance in the Rational Performance Tester Information Center.
- Learn how to enhance the load generation capability of Rational Performance Tester in the developerWorks article "Optimize load handling for high-volume tests with IBM Rational Performance Tester" (developerWorks, May 2011).
- Explore runtime parameterization using custom codes in the developerWorks article, "Faster processing of numerous test data parameters for performance tests" (developerWorks, July 2011).
- Download the trial version of IBM Rational Performance Tester.