Find out how the Apache open source tool JMeter was deployed to test the responsiveness of a middleware solution based on IBM® WebSphere Application Server and the WebSphere Branch Transformation Toolkit (BTT). Performance tests were designed to simulate varying concurrent user loads using a variety of Interactive Financial eXchange (IFX) request messages. If your project's performance testing budget is limited and your solution employs XML messaging, then the lessons learned at the conclusion of this article may help you plan your own performance testing strategy.
A recent project for a financial institution delivered a middleware infrastructure to support a growing list of applications that required access to the business' core financial systems. The architectural direction was to mandate that all core financial system requests be routed through this middleware solution using the XML-based IFX messaging standard. Figure 1 shows the middleware infrastructure in relation to the first application to use it (shown in bold), and future applications and future back-end systems (shown in gray).
Figure 1. The solution to be tested
For this high-visibility solution to gain acceptance, it had to demonstrate optimum performance under load. This was especially important for response-time sensitive clients, such as the contact center's CRM application. Another consideration was the need to reuse the selected performance testing approach as new applications are brought online both "in front of" and behind the middleware (as in Figure 1, which shows a future implementation of a corporate and consumer credit card services system behind the middleware).
The first application designated to use the middleware infrastructure, a credit processing application, was scheduled to be implemented after completion of the middleware project. This meant that the test team had to devise tests to simulate production loads without the benefit of a user interface to prepare and submit middleware requests.
The financial institution did not have an appropriate toolset to support middleware performance testing. The challenge, therefore, was to confidently report the observed middleware performance characteristics with a minimal budget for tools and preparation effort.
Research of available open source testing tools revealed that Apache JMeter could support the middleware performance test requirements. JMeter provides a GUI-based application to design and execute a variety of reusable test plans. JMeter also supports the capture of test results in XML format for post-test statistical analysis. These two features helped the test team develop and document repeatable test results, meeting the high-visibility challenge.
Many open source test tools are designed to test Web sites, with the expectation that the test should emulate user interaction with one or more pages or forms. Because an application Web interface was not available at the time the middleware solution was tested, the selected tool had to support XML-based messaging without browser interaction. JMeter's SOAP/XML request component met this requirement.
Finally, the fact that JMeter is a product of the Apache Software Foundation meant that the project was not required to fund the licensing costs of a commercial testing tool, satisfying the limited-budget condition.
The objective of the performance test was to submit, under various concurrent load conditions, a random selection of predefined IFX-encoded request messages and record the elapsed time to receive the anticipated IFX-encoded response. The following five JMeter test plan components were used to prepare the performance test scripts:
Test Plan
This is the master component for a test. The test name was specified here according to the project's naming convention. The Functional Test Mode option was also selected so that the full IFX-encoded response was captured in the test results managed by the View Results Tree component (described below).
Figure 2. JMeter test plan
HTTP Header Manager
This component was used to specify the HTTP header values required by the middleware. Each IFX-encoded request sent to the middleware would include these HTTP header values.
Figure 3. JMeter HTTP header manager
Thread Group
This component was repeated as required in a test plan to emulate a specific number of concurrent users. For example, to emulate 5 concurrent users, 5 Thread Groups were specified.
Figure 4. JMeter thread group
Note that the Thread Group component has a field labeled Number of Threads. This field controls the number of threads associated with a Thread Group. The decision was made to limit each Thread Group to one thread, since each Thread Group had a unique set of randomly selected IFX-encoded requests (see SOAP/XML-RPC Request below). If multiple threads were specified for one or more Thread Groups, then the same message set would have been sent multiple times, defeating the purpose of the random selection criterion.
SOAP/XML-RPC Request
This component was repeated for the desired number of IFX-encoded requests each Thread Group was to send. The actual IFX-encoded request was specified in this component.
Figure 5. JMeter SOAP/XML-RPC request
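For illustration, the body pasted into each SOAP/XML-RPC Request component might look something like the following minimal IFX-style fragment. The element names follow public IFX conventions (SignonRq, BankSvcRq, RqUID), but the financial institution's actual message schema is not shown in this article, so treat this purely as an assumed shape:

```xml
<!-- Illustrative shape only; not the institution's actual IFX schema. -->
<IFX>
  <SignonRq>
    <ClientApp>
      <Org>ExampleBank</Org>
      <Name>CreditApp</Name>
      <Version>1.0</Version>
    </ClientApp>
  </SignonRq>
  <BankSvcRq>
    <!-- RqUID must be unique per request under the IFX specification -->
    <RqUID>f81d4fae-7dec-11d0-a765-00a0c91e6bf6</RqUID>
  </BankSvcRq>
</IFX>
```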
View Results Tree
This component served two purposes. As the test was executing, this user interface displayed the test progress as messages were sent and received. In addition, this component wrote the test results to a file for post-test analysis.
Figure 6. JMeter View Results Tree
JMeter test plans were devised to emulate a variety of concurrent user loads, from a single user through to a maximum of 80 concurrent users. For all test plans, the five components described above were deployed in a consistent manner to simplify performance test execution.
Once the required JMeter components were identified and a general test plan design conceived, the test scripts had to be built. Fortunately, a set of over 300 IFX-encoded model request messages and associated test data was available from the System Integration Test (see Resources) and could be reused. The challenge was in preparing test scripts that could send up to 8,000 (100 per thread for 80 threads) randomly selected request messages. The messages were selected randomly to better approximate the steady state of production conditions, in which no one request type was likely to be submitted more than another. Using the JMeter user interface alone, this would have meant manually cutting and pasting messages into as many as 8,000 SOAP/XML-RPC Requests. To further complicate the task, each request also required a unique identifier according to the financial institution's IFX specification.
As mentioned, this project's performance testing approach was to be reused for future middleware releases. For this reason, the test team invested some effort in preparing a Java application that would output JMeter XML-encoded test scripts based on specified parameters. The Java application, Scripter, can prepare a performance test script that has a specified number of threads and specified number of IFX-encoded messages per thread, selected at random by the application. The IFX-encoded messages are sourced from a message set provided in a directory specified by Scripter's properties file.
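The Scripter source itself is available from Resources. As a rough sketch of the kind of selection logic such a generator needs, the following hypothetical Java class picks messages at random from a model-message pool for each emulated thread and stamps each copy with a unique identifier. The class name, the placeholder token, and the use of UUIDs are assumptions for illustration, not the actual Scripter implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.UUID;

// Hypothetical sketch of a Scripter-style selection step: for each emulated
// thread, choose messages at random from a pool of model requests and give
// each copy its own unique request identifier.
public class ScriptSketch {

    // Assumed placeholder token marking where the unique ID belongs
    // in each model message.
    static final String ID_TOKEN = "@@RQUID@@";

    // Returns one list of prepared request bodies per emulated thread.
    public static List<List<String>> prepare(List<String> pool,
                                             int threads,
                                             int messagesPerThread,
                                             Random rnd) {
        List<List<String>> plan = new ArrayList<>();
        for (int t = 0; t < threads; t++) {
            List<String> threadMessages = new ArrayList<>();
            for (int m = 0; m < messagesPerThread; m++) {
                // Random selection approximates the steady-state mix of
                // request types seen in production.
                String template = pool.get(rnd.nextInt(pool.size()));
                // Every request gets a fresh unique identifier.
                threadMessages.add(template.replace(ID_TOKEN,
                        UUID.randomUUID().toString()));
            }
            plan.add(threadMessages);
        }
        return plan;
    }
}
```

The real Scripter additionally serializes this plan into JMeter's XML test-script format, which is what makes the 80-thread, 8,000-message scripts practical to produce.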
You can download source code and usage instructions for the Scripter Java application from the link in the Resources.
JMeter was installed on a two-way IBM eServer xSeries® 360 server with 2-GB RAM running Windows® 2000. Figure 7 shows the test configuration.
Figure 7. JMeter performance testing configuration
As the tests were executed, IFX-encoded responses were recorded so that the captured MQ Time and Total Time metrics, embedded by the middleware in the responses, could be analyzed. The elapsed time observed by JMeter itself (the JMeter Time) was also analyzed, though this figure includes the network latency between the middleware and JMeter.
The test team executed three cycles of performance tests, with modifications and configuration tuning after the first two cycles to improve application performance.
The test team used Microsoft® Excel® spreadsheets to import test results and perform statistical calculations on the elapsed time metrics described above. The results were then graphed to show that the application provided subsecond responsiveness for a majority of test conditions.
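The spreadsheet work amounted to standard elapsed-time statistics over the recorded samples. A minimal sketch of equivalent logic, assuming a nearest-rank percentile (the article does not specify which statistics beyond averages the team computed):

```java
import java.util.Arrays;

// Minimal elapsed-time summary statistics of the kind computed in the
// team's spreadsheets: mean and a nearest-rank percentile over samples
// measured in milliseconds. The percentile method is an assumption.
public class ElapsedStats {

    public static double mean(long[] samplesMs) {
        double sum = 0;
        for (long s : samplesMs) sum += s;
        return sum / samplesMs.length;
    }

    // Nearest-rank percentile for p in (0, 100].
    public static long percentile(long[] samplesMs, double p) {
        long[] sorted = samplesMs.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.length);
        return sorted[Math.max(0, rank - 1)];
    }
}
```

Checking a high percentile as well as the mean matters here: a subsecond average can hide a long tail, and the "subsecond responsiveness for a majority of test conditions" claim is a percentile-style statement.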
In summary, JMeter was an excellent choice as this project's performance test tool. The following lessons learned provide additional detail.
JMeter was easy to install and of medium complexity to learn (see the next lesson learned). The selected JMeter components supported a common structure for all the performance test scripts. The XML-encoded output of test results was also a handy feature for post-test analysis, as this option captured the embedded performance statistics in the IFX-encoded reply messages.
To properly prepare performance test scripts, the script developer must have a good understanding of distributed applications using HTTP and XML protocols. A business user may find it difficult to work with the technical specifications for the various JMeter components.
The characteristics of our performance tests (randomized message selection, concurrency, and unique values embedded in each IFX-encoded request) required an automated approach to generating test scripts. The test team fortunately had sufficient Java technical skills to automate this task. This application is provided at the end of this article for those who may have similar needs.
If time (and talent!) had allowed, the team might have developed a new JMeter component that conformed to this project's needs and submitted it back to the Apache organization.
The JMeter application can measure elapsed time between the transmission of an IFX-encoded request and the receipt of an IFX-encoded reply. This metric, however, does not provide insight into potential bottlenecks throughout the distributed middleware solution. The middleware development team provided additional performance metrics to isolate the elapsed time for host communication, message parsing, and the middleware elapsed time for transaction processing. These metrics were embedded as XML comments in the IFX-encoded reply.
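Because the metrics arrived inside the replies, post-test analysis needs a step that pulls them back out of the captured results. A hedged sketch, assuming a simple name=value XML-comment format (the article does not document the middleware's actual comment syntax):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Extracts name=value timing metrics embedded as XML comments in a reply,
// e.g. <!-- MQTime=42 -->. The comment format is an assumed illustration.
public class MetricExtractor {

    private static final Pattern METRIC =
            Pattern.compile("<!--\\s*(\\w+)\\s*=\\s*(\\d+)\\s*-->");

    public static Map<String, Long> extract(String reply) {
        Map<String, Long> metrics = new HashMap<>();
        Matcher m = METRIC.matcher(reply);
        while (m.find()) {
            metrics.put(m.group(1), Long.parseLong(m.group(2)));
        }
        return metrics;
    }
}
```

Embedding the metrics as comments keeps the reply valid against the IFX schema while still letting an analysis step like this one recover the per-stage timings.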
The test team used Microsoft Excel spreadsheets to import test results and perform statistical calculations on the elapsed time metrics. Open source alternatives include Gnumeric and the OpenOffice.org spreadsheet.
The test team used OpenSTA as the System Integration Test tool.
"Put Your DB2 Database to the Test: Measuring Performance with JMeter" employs JMeter to measure query performance and throughput for DB2 Universal Database.
The article "... on performance: A load of stress" discusses J2EE application testing.
"... application testing with Puffin: Puffin testing framework, Part 1" describes Puffin, a pure Python framework for testing Web applications.
You may also enjoy Scott Barber's series "Beyond performance testing," which offers an in-depth exploration of testing.
An entire 2002 issue of the IBM Systems Journal was dedicated to the theme of infrastructure and systems management.
The JMeter tests described in this article were developed to test an
application based on the WebSphere
Application Server and the Branch Transformation Toolkit for WebSphere Studio. You can also download
trial versions of many WebSphere products.
Visit the developerWorks Open source zone for extensive how-to information, tools, and project updates to help you develop with open source technologies and use them with IBM's products.
Get products and technologies
In addition to Web applications, Apache JMeter is also happy to test files, servlets, Perl scripts, Java objects, databases and queries, FTP servers, and more. JMeter is part of the Apache Jakarta Project and can be downloaded from the Jakarta Project's Binary Downloads page.
Download the Scripter Java application, which was developed
using IBM WebSphere Studio Application Developer. The ZIP file contains
the Scripter source code and a readme file with usage hints and tips.
The Interactive Financial eXchange (IFX) Forum's mission is to develop a robust XML framework for the electronic business-to-business exchange of data among financial service institutions around the world.
Greg Herringer is an IT Architect with 15 years' experience in customer relationship management and contact center technologies, with a focus on the financial services and public sector industries. He has a wide variety of experience across the entire application development life cycle. Greg can be contacted at email@example.com.