Use Rational Performance Tester to detect runtime parameterization issues

Apply Java-based custom code to simulate complex workloads


To adequately test performance, testers must be able to simulate large volume loads of two types:

  • A large number of concurrent or active users (for example, 5,000)
  • High page throughput (for example, 100 pages per second)

For large volume tests, it is difficult to prepare test data and feed it to the automation tool at runtime. Performance test automation tools encounter two common problems related to runtime parameterization, or data pooling:

  • Tools cannot accurately log test data information for specific kinds of failures. Without customization, the tools cannot detect invalid test data or the source of the failure.
  • Tools cannot easily read large volumes of user data from files. High throughput can cause load generation problems because it produces excessive I/O on the load generator operating systems. In addition, an Excel spreadsheet that contains an excessive number of data entries (>65,500) cannot be used to load test data at runtime.

IBM® Rational® Performance Tester includes the custom code feature, which addresses these problems. You are probably familiar with using custom code to customize test scripts, control playback behavior, and adjust reporting mechanisms. Because the Java™-based custom code feature acts as a standalone Java program, you can use custom code to address almost any customization requirement.
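
For reference, a custom code module is an ordinary Java class that implements the ICustomCode2 interface; the tool instantiates the class and calls its exec method during playback. The following minimal skeleton follows the shape of the template that Rational Performance Tester generates (the package and class name match the example used later in this article):

package customcode;

import com.ibm.rational.test.lt.kernel.services.ITestExecutionServices;

// Minimal skeleton of a Rational Performance Tester custom code class.
// The tool creates an instance with the no-argument constructor and
// calls exec() each time the custom code element runs in the test.
public class Username_Logger implements
      com.ibm.rational.test.lt.kernel.custom.ICustomCode2 {

   public Username_Logger() {
   }

   // args holds the values passed in from the test script (data pool
   // values, references, test variables). The returned String can be
   // consumed by later test elements; return null if no value is needed.
   public String exec(ITestExecutionServices tes, String[] args) {
      return null;
   }
}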

This article explains how to extend the capabilities of Rational Performance Tester to create custom logs that report test data failures and how to use custom code to enable runtime parameterization.

Detect and log invalid or erroneous test data

This article is based on the simple scenario shown in Figure 1, in which a user logs in to an application, visits his or her profile page, and logs out. During the test, the tester substitutes test values for username at runtime (for test purposes, the password is the same for all users). The tester must find an effective means of creating custom logs at runtime. The logs must include the total number of username values used during testing and the details for the usernames that were able to log in successfully.

Figure 1. Test case scenario
Use case diagram for the scenario used in this article

Use custom code to pass arguments

A custom log file created at runtime needs to include each username value and information about the condition that triggered its log-in success or failure.

The first step is to add custom code that logs the username value into a file. This action is performed in the test script, which is shown in Figure 2.

Figure 2. Test script
List of pages in the Login_Testcase directory

The test data for Home Page includes the details for username and password, as shown in Figure 3.

Figure 3. Test data for Home Page
Username and password test data for home page

To create a custom log file, follow these steps:

  1. For username, specify a data pool for the username variable, in this case User_Datapool, as shown in Figure 4.
Figure 4. Data pool name in the "Substituted with" column
Username value substituted with User_Datapool
  2. To capture the username value with which the login transaction is performed, insert custom code and pass the data pool value to it as an argument, using the following steps (as shown in Figure 5).
    1. Select the secondary request that is below the primary request in bold.
    2. Click Insert.
    3. Select Custom Code.
Figure 5. Insert custom code to capture username value
Select request, click Insert, choose Custom Code
  3. Pass arguments to the custom code using the following steps, as shown in Figure 6.
    1. For Class name, type customcode.Username_Logger
    2. Click Add.
    3. Select the Username variable of the User_Datapool data pool.
    4. Click Select to pass the argument.
Figure 6. Pass arguments to custom code
Choose custom code to receive arguments
  4. Click Generate code to generate the default custom code template, as shown in Figure 7.
Figure 7. Click Generate code
Screen capture showing Generate Code button
  5. Using the following steps, decide which username values to capture: those that logged in successfully or those that did not. Create an IF condition so that the custom code captures only the username values that you want to record. For example, if the keyword Welcome denotes the successful login of a user, add this keyword as the content to be searched for in the response of the login request. Username values that match this condition are captured. (A conceptual Java equivalent appears after Figure 8.)
    1. Select the secondary request that appears below the primary URL, as shown in Figure 8.
    2. Click Insert.
    3. Select Condition (If). Click Yes if prompted.
Figure 8. If condition addition to evaluate successful transactions
If condition
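
Conceptually, the IF condition performs the same check as the following Java fragment. This is a sketch for illustration only; in the tool, you configure the comparison in the condition's properties rather than writing code. Here, response stands for the content of the login response:

public class LoginCheck {
   // Returns true when the response denotes a successful login,
   // that is, when it contains the Welcome keyword.
   static boolean loginSucceeded(String response) {
      return response.contains("Welcome");
   }
}
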
  6. The IF condition ensures that the custom code logs only those username values that satisfy it. To define the condition, compare two operands: the Welcome keyword and the response to the login request, which is searched for occurrences of the keyword. Use the following steps to create a reference and add it to the IF condition.
    1. Select the response of the primary request that is in bold.
    2. Right-click the response content, as shown in Figure 9.
    3. Select Create Field Reference, which creates a reference for the entire response content.
      Figure 9. Create Field Reference
      Login response and Create Field Reference
    4. Name the reference Content: Login_Response. Click OK, as shown in Figure 10.
      Figure 10. Provide name for field reference
      Edit Reference Name dialog box
  7. Define the log file name and path as a global variable by following these steps, as shown in Figure 11:
    1. Select the Test Variables entity in the test script.
    2. Click Add.
    3. Select Variable Declaration.
      Figure 11. Add File name and path as a global variable
      Select Test Variables, then Add Variable Declaration
    4. Type FileName for the Variable Name.
      Figure 12. Variable name
      Variable Name field shows FileName
    5. Initialize the value of this variable to the file name of your log, as shown in Figure 13. In this case, the file name is C:\\Valid_Data_Log.txt. (Note that there are two backslashes to escape the special character \. When you specify a file name on UNIX operating systems, you do not need the extra backslash, because the file separator character on UNIX is a forward slash (/). See the sketch after Figure 13 for a platform-independent alternative.)
Figure 13. Initialize the variable
C:\\Valid_Data_Log.txt in Text field
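
If the same script must run on both Windows and UNIX load generators, one alternative is to build the path with Java's platform-independent separator rather than hard-coding backslashes. A minimal sketch (the temporary directory is used here only for illustration):

import java.io.File;

public class LogPathExample {
   public static void main(String[] args) {
      // File.separator resolves to "\" on Windows and "/" on UNIX,
      // so the same code yields a valid path on either platform.
      String logDir = System.getProperty("java.io.tmpdir");
      String logFile = logDir + File.separator + "Valid_Data_Log.txt";
      System.out.println(new File(logFile).getAbsolutePath());
   }
}
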
  8. Go back to the custom code in the script and add the FileName variable as a second argument, as shown in Figure 14.
Figure 14. Add filename as argument to custom code
Specify the FileName variable
  9. Insert the code in Listing 1 in the file Username_Logger.java.
Listing 1. Custom code for logging username values
// Add these imports at the top of the generated Username_Logger.java file.
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

// Place the following code inside the generated
// exec(ITestExecutionServices tes, String[] args) method.
BufferedWriter out = null;
FileWriter fstream;
try {
   // args[0] is the username value; args[1] is the FileName variable.
   fstream = new FileWriter(args[1], true); // true appends data instead of overwriting
   out = new BufferedWriter(fstream);
   out.write(System.getProperty("line.separator")); // start a new line in the file
   out.write(args[0]); // write the username value into the file
}
catch (IOException e) {
   e.printStackTrace();
}
finally {
   if (out != null) {
      try {
         out.close();
      } catch (IOException e) {
         e.printStackTrace();
      }
   }
}
return null; // exec must return a String; no return value is needed here

Now the script is ready to log all of the valid username values, one per line, in the log file Valid_Data_Log.txt in the C:\ directory.

Extensions of this method

You can use this method for other purposes. For example, you can log the invalid or erroneous username values by selecting the option Negate the operator (NOT (op)) in the IF condition. By changing the IF condition, you control which data gets logged.

You can also customize this method to validate other test data and to test for other conditions, according to your particular requirements.

Custom code to enable runtime parameterization

You can use Java custom code to pass data pool values at runtime from an external file. This method solves many existing problems:

  • Uneven splits of the data pool across agents, and difficulty sharing the same data across agents (in shared and segmented modes)
  • Inability to customize the load simulation by controlling which data resides on which agent machines
  • Issues with how Excel spreadsheets handle large volumes of data
  • Excessive I/O from high-volume loads through traditional methods of data pooling

With custom code, you substitute the value of the username used in the previous scenario with values from an external file, rather than using the traditional data pool feature included in Rational Performance Tester.

Use the code in Listing 2 to implement this method. This custom code reads the values from a file. You can customize this code to suit your requirements.

Listing 2. Custom code to manually carry out runtime parameterization
// Add these imports at the top of the custom code class.
import java.io.File;
import java.io.FileNotFoundException;
import java.util.ArrayList;
import java.util.Scanner;

// Place the following code inside the generated
// exec(ITestExecutionServices tes, String[] args) method.
int total_users_per_agent = <<xxxx>>; // number of virtual users per agent machine
String datapool_var = null;

// args[0] is the Username data pool value passed to this custom code;
// the username values must be numeric for parseInt to succeed.
int current_user = Integer.parseInt(args[0]);

try {
   // Read every row of the CSV data pool into memory. Scanner.next()
   // returns whitespace-delimited tokens, so rows must not contain spaces.
   ArrayList<String> datapool_array = new ArrayList<String>();
   Scanner s = new Scanner(new File("C:\\Datapool.csv"));
   while (s.hasNext()) {
      datapool_array.add(s.next());
   }
   s.close();

   // Apply the mod function: row index = current_user % total_users_per_agent.
   int user_cnt = current_user
         - (total_users_per_agent * (current_user / total_users_per_agent));

   // Split the selected row on the comma separator to get individual column values.
   String[] desired_column_value_datapool = datapool_array.get(user_cnt).split(",");
   datapool_var = desired_column_value_datapool[<<index_of_column>>];
}
catch (FileNotFoundException e) {
   e.printStackTrace();
}

// datapool_var is the desired row-column value of the data pool.
return datapool_var;

The variable total_users_per_agent denotes the number of virtual users running the test script that contains this custom code on an agent machine.

Note: In this context, the agent machine is a physical machine and not a single instance (JVM) of the Rational Performance Tester load driver agent.

The variables total_users_per_agent and current_user (derived from the username data pool) are used to compute the row index in the data pool file. Each row of the file is read into the datapool_array list, and the computed index selects the desired row. The Java split method then splits the row contents on the comma separator to derive the individual column values. This custom code requires the data pool to be in CSV format.
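
As a worked example, assume total_users_per_agent is 250 and the username value for the current virtual user parses to 733 (both values are hypothetical). Integer division gives 733 / 250 = 2, so the expression evaluates to 733 - (250 × 2) = 233, which is exactly the remainder 733 % 250:

public class ModExample {
   public static void main(String[] args) {
      int total_users_per_agent = 250; // hypothetical value
      int current_user = 733;          // hypothetical username value

      // the expression from Listing 2...
      int user_cnt = current_user
            - (total_users_per_agent * (current_user / total_users_per_agent));

      // ...is equivalent to the % (remainder) operator
      System.out.println(user_cnt);                              // 233
      System.out.println(current_user % total_users_per_agent);  // 233
   }
}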

The step-by-step method for creating custom code and passing arguments, described earlier in this article, can be used to implement the custom code that enables runtime parameterization.

Issues to be aware of

Use the custom code logging method only when the amount of data written to files is small. Do not use this method to log the entire responses of requests, because doing so results in heavy I/O operations and causes performance degradation through high CPU waits from disk I/O latency and a long queue of I/O processes. If you must record part of a response, see the sketch that follows.
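
One way to keep such logging small is to write only a short, fixed-size snippet of the response instead of the whole body. A minimal sketch (the logSnippet helper and the 200-character cap are illustrative, not part of the tool):

import java.io.BufferedWriter;
import java.io.IOException;

public class SnippetLogger {
   // Write at most maxChars characters of the response so that a single
   // log entry stays small even when the response body is large.
   static void logSnippet(BufferedWriter out, String response, int maxChars)
         throws IOException {
      String snippet = response.length() > maxChars
            ? response.substring(0, maxChars)
            : response;
      out.write(snippet);
      out.write(System.getProperty("line.separator"));
   }
}

Called with a maxChars value of 200, for example, this caps each log entry at 200 characters regardless of the response size.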

Performance degradation caused by these actions, or by related issues such as the larger memory footprint of many large custom code modules, reduces load generation capability. High CPU usage on the load generation hardware is directly related to the poor response times reported by the performance test tool, because high CPU usage can leave the tool in a hung state. The tool then reports response times that include its own latency in receiving each response, so the performance measurement is skewed.

Monitor the utilization of your load generation hardware, particularly the processor utilization of both the workbench and the agent machines, while running performance tests. It is also helpful to monitor the memory used by the Java process while simulating a high volume of concurrent users. The utilization of TCP/IP sockets likewise affects load generation performance when a high volume of requests is generated against the server. Monitor disk I/O utilization, and check for disk latencies that result from excessive logging or from the logging levels configured in the test schedule.

Ensure that utilization does not exceed a suitable threshold. Different kinds of application testing demand different thresholds for load generation hardware, but in most situations, do not allow CPU, memory, disk, network, or TCP/IP socket utilization in the load generation environment to exceed 70 percent.

Build your custom code so that the performance of the automation tool during load generation stays within an acceptable limit.

