Test WebSphere performance with Apache JMeter

An open source tool ideal for testing IFX messaging middleware

Find out how the Apache open source tool JMeter was deployed to test the responsiveness of a middleware solution based on IBM® WebSphere Application Server. Performance tests were designed to simulate varying concurrent user loads using a variety of Interactive Financial eXchange (IFX) request messages. If your project's performance testing budget is limited and your solution employs XML messaging, then the lessons learned at the conclusion of this article may help you plan your own performance testing strategy.

Performance testing challenges for a high-visibility project

A recent project for a financial institution delivered a middleware infrastructure to support a growing list of applications that required access to the business's core financial systems. The architecture mandated that all core financial system requests be routed through this middleware solution using the XML-based IFX messaging standard. Figure 1 shows the middleware infrastructure in relation to the first application to use it (shown in bold), and to future applications and back-end systems (shown in gray).

Figure 1. The solution to be tested

For this high-visibility solution to gain acceptance, it had to demonstrate optimum performance under load. This was especially important for response-time-sensitive clients, such as the contact center's CRM application. Another consideration was the need to reuse the selected performance testing approach as new applications are brought online both "in front of" and behind the middleware (as in Figure 1, which shows a future implementation of a corporate and consumer credit card services system behind the middleware).

No user interface

The first application designated to use the middleware infrastructure, a credit processing application, was scheduled to be implemented after completion of the middleware project. This meant that the test team had to devise tests to simulate production loads without the benefit of a user interface to prepare and submit middleware requests.

Limited budget

The financial institution did not have an appropriate toolset to support middleware performance testing. The challenge, therefore, was to confidently report the observed middleware performance characteristics with a minimal budget for tools and preparation effort.

JMeter to the rescue

Research of available open source testing tools revealed that Apache JMeter could support the middleware performance test requirements. JMeter provides a GUI-based application to design and execute a variety of reusable test plans. JMeter also supports the capture of test results in XML format for post-test statistical analysis. These two features helped the test team develop and document repeatable test results, meeting the high-visibility challenge.

Many open source test tools are designed to test Web sites, with the expectation that the test should emulate user interaction with one or more pages or forms. Because an application Web interface was not available at the time the middleware solution was tested, the selected tool had to support XML-based messaging without browser interaction. JMeter's SOAP/XML request component met this requirement.

Finally, because JMeter is a product of the Apache Software Foundation, the project did not have to fund the licensing costs of a commercial testing tool, satisfying the limited-budget condition.

Designing the test scripts

The objective of the performance test was to submit, under various concurrent load conditions, a random selection of predefined IFX-encoded request messages and record the elapsed time to receive the anticipated IFX-encoded response. The following five JMeter test plan components were used to prepare the performance test scripts:

Test plan

This is the master component for a test. The test name was specified here according to the project's naming convention. The Functional Test Mode was also selected so that the full IFX-encoded response was captured in the test results managed by the View Results Tree component.

Figure 2. JMeter test plan

HTTP header manager

This component was used to specify the HTTP header values required by the middleware. Each IFX-encoded request sent to the middleware would include these HTTP header values.

Figure 3. JMeter HTTP Header Manager
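
To give a sense of what the Header Manager contributes, Listing 1 sketches the equivalent request in plain Java. The endpoint URL and header value shown are illustrative assumptions; the actual names and values came from the middleware's interface specification.

Listing 1. Posting an IFX-encoded request over HTTP (illustrative)

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class IfxRequestExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical middleware endpoint; not the project's actual URL.
        URL url = new URL("http://middleware.example.com/ifx/gateway");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);

        // JMeter's HTTP Header Manager attaches headers like this to
        // every request; the value here is an assumed example.
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");

        String ifxRequest = "<?xml version=\"1.0\"?><IFX>...</IFX>";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(ifxRequest.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}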

Thread group

This component was repeated as required in a test plan to emulate a specific number of concurrent users. For example, to emulate 5 concurrent users, 5 Thread Groups were specified.

Figure 4. JMeter Thread Group

Note that the Thread Group component has a field labeled Number of Threads. This field controls the number of threads associated with a Thread Group. The decision was made to limit each Thread Group to one thread, since each Thread Group had a unique set of randomly selected IFX-encoded requests (see SOAP/XML-RPC request below). If multiple threads were specified for one or more Thread Groups, then the same message set would have been sent multiple times, defeating the purpose of the random selection criterion.

SOAP/XML-RPC request

This component was repeated for the desired number of IFX-encoded requests that each Thread Group was to send. The actual IFX-encoded request was specified in this component.

Figure 5. JMeter SOAP/XML-RPC Request

View Results Tree

This component served two purposes. As the test was executing, this user interface displayed the test progress as messages were sent and received. In addition, this component wrote the test results to a file for post-test analysis.

Figure 6. JMeter View Results Tree

JMeter test plans were devised to emulate a variety of concurrent user loads, from a single user through to a maximum of 80 concurrent users. For all test plans, the five components described above were deployed in a consistent manner to simplify performance test execution.

Building the test scripts

Once the required JMeter components were identified and a general test plan design conceived, the test scripts had to be built. Fortunately, there was a set of over 300 IFX-encoded model request messages and associated test data available from the System Integration Test (see Related topics) that could be reused. The challenge was in preparing test scripts that could send up to 8,000 (100 per thread for 80 threads) randomly selected request messages. The messages were selected randomly to better approximate the steady state of production conditions, in which no one request type was likely to be submitted more than another. Using the JMeter user interface alone, this would have meant manually cutting and pasting messages into as many as 8,000 SOAP/XML-RPC Requests. To further complicate the task, each request also required a unique RQUID, according to the financial institution's IFX specification.
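
Generating a fresh RQUID for each copy of a model message is easy to automate. Listing 2 is a minimal sketch; it assumes the identifier is carried in an <RqUID> element, which may differ from your institution's IFX layout.

Listing 2. Stamping a unique RQUID into a model message (illustrative)

import java.util.UUID;

public class RquidStamper {
    // Returns a copy of the model message with a freshly generated
    // RQUID. Assumes the identifier appears as an <RqUID> element;
    // adjust the pattern to match your IFX specification.
    public static String stampRquid(String modelMessage) {
        String rquid = UUID.randomUUID().toString().toUpperCase();
        return modelMessage.replaceFirst(
                "<RqUID>[^<]*</RqUID>",
                "<RqUID>" + rquid + "</RqUID>");
    }
}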

Automating test-script creation

As mentioned, this project's performance testing approach was to be reused for future middleware releases. For this reason, the test team invested some effort in preparing a Java application that would output JMeter XML-encoded test scripts based on specified parameters. The Java application, Scripter, can prepare a performance test script that has a specified number of threads and specified number of IFX-encoded messages per thread, selected at random by the application. The IFX-encoded messages are sourced from a message set provided in a directory specified by Scripter's properties file.
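
Listing 3 sketches the core of this approach. The class, method, and property names are hypothetical, not the actual Scripter API, and the JMeter XML output is omitted; the real source is available from the download link below.

Listing 3. Random message selection per thread (simplified sketch)

import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.util.Properties;
import java.util.Random;

public class ScripterSketch {
    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        props.load(new FileReader("scripter.properties"));

        // Directory of IFX model messages, one message per file.
        // Assumes the directory exists and holds only message files.
        File[] models = new File(props.getProperty("message.dir")).listFiles();
        int threads = Integer.parseInt(props.getProperty("threads"));
        int perThread = Integer.parseInt(props.getProperty("requests.per.thread"));

        Random random = new Random();
        for (int t = 0; t < threads; t++) {
            for (int r = 0; r < perThread; r++) {
                // Random selection approximates the production mix in
                // which no request type dominates.
                File model = models[random.nextInt(models.length)];
                String message = new String(
                        Files.readAllBytes(model.toPath()),
                        StandardCharsets.UTF_8);
                // Each copy gets its own RQUID (see Listing 2) and is
                // written out as one SOAP/XML-RPC Request element in
                // the JMeter test-script XML (omitted for brevity).
                emitJmeterRequest(t, r, message);
            }
        }
    }

    private static void emitJmeterRequest(int thread, int request, String message) {
        System.out.printf("thread %d, request %d: %d chars%n",
                thread, request, message.length());
    }
}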

You can download source code and usage instructions for the Scripter Java application from the link in the Related topics.

Executing the tests

JMeter was installed on a two-way IBM eServer xSeries® 360 server with 2 GB of RAM running Windows® 2000. Figure 7 shows the test configuration.

Figure 7. JMeter performance testing configuration

As the tests were executed, IFX-encoded responses were recorded so that the captured MQ Time and Total Time metrics, embedded by the middleware in the responses, could be analyzed. The JMeter Time, as observed by JMeter, was also analyzed, though this number includes the network latency between the middleware and JMeter.

The test team executed three cycles of performance tests, with modifications and configuration tuning after the first two cycles to improve application performance.

Analyzing the results

The test team used Microsoft® Excel® spreadsheets to import test results and perform statistical calculations on the elapsed time metrics described above. The results were then graphed to show that the application provided subsecond responsiveness for a majority of test conditions.
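
The same statistics are simple to compute in code. Listing 4 is a minimal sketch, assuming the elapsed-time samples (in milliseconds) have already been parsed out of the JMeter results file; the sample values in main are invented for demonstration only.

Listing 4. Average and 90th-percentile response time (illustrative)

import java.util.Arrays;

public class ElapsedTimeStats {
    // Computes the average and 90th-percentile (nearest-rank) response
    // time from elapsed-time samples in milliseconds.
    public static void report(long[] samplesMs) {
        long[] sorted = samplesMs.clone();
        Arrays.sort(sorted);
        long sum = 0;
        for (long s : sorted) {
            sum += s;
        }
        double average = (double) sum / sorted.length;
        long p90 = sorted[(int) Math.ceil(0.9 * sorted.length) - 1];
        System.out.printf("average=%.1f ms, 90th percentile=%d ms%n",
                average, p90);
    }

    public static void main(String[] args) {
        // Invented sample data for demonstration.
        report(new long[] {420, 380, 510, 950, 610, 470, 720, 390, 530, 460});
    }
}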

Lessons learned

In summary, JMeter was an excellent choice as this project's performance test tool. The following lessons learned provide additional detail.

JMeter met our needs

JMeter was easy to install and of medium complexity to learn (see the next lesson learned). The selected JMeter components supported a common structure for all the performance test scripts. The XML-encoded output of test results was also a handy feature for post-test analysis, as this option captured the embedded performance statistics in the IFX-encoded reply messages.

JMeter users should have technical skills

To properly prepare performance test scripts, the script developer must have a good understanding of distributed applications that communicate over HTTP with XML payloads. A business user may find it difficult to work with the technical specifications for the various JMeter components.

Creating large scripts may require additional automation

The characteristics of our performance tests (randomized message selection, concurrency, and unique values embedded in each IFX-encoded request) required an automated approach to generating test scripts. The test team fortunately had sufficient Java technical skills to automate this task. This application is provided at the end of this article for those who may have similar needs.

If time (and talent!) had allowed, the team might have developed a new JMeter component that conformed to this project's needs and submitted it back to the Apache organization.

Custom performance metrics can assist with problem determination

The JMeter application can measure elapsed time between the transmission of an IFX-encoded request and the receipt of an IFX-encoded reply. This metric, however, does not provide insight into potential bottlenecks throughout the distributed middleware solution. The middleware development team provided additional performance metrics to isolate the elapsed time for host communication, message parsing, and the middleware elapsed time for transaction processing. These metrics were embedded as XML comments in the IFX-encoded reply.
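
Listing 5 sketches how such embedded metrics might be pulled back out of a reply. The name=value comment format shown is an assumption made for illustration; the actual format was specific to this project's middleware.

Listing 5. Extracting metrics embedded as XML comments (illustrative)

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MetricExtractor {
    // Matches XML comments of the assumed form <!-- MQTime=120 -->.
    private static final Pattern METRIC =
            Pattern.compile("<!--\\s*(\\w+)=(\\d+)\\s*-->");

    public static void printMetrics(String ifxReply) {
        Matcher m = METRIC.matcher(ifxReply);
        while (m.find()) {
            System.out.println(m.group(1) + " = " + m.group(2));
        }
    }

    public static void main(String[] args) {
        printMetrics("<IFX>...</IFX>"
                + "<!-- MQTime=120 --><!-- TotalTime=480 -->");
    }
}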


Downloadable resources


Related topics

