Test XML messages over WebSphere MQ with Rational Performance Tester: Part 3. Schedule and run the test and then review results

In Part 3 of this three-part tutorial, you will schedule and run the test for the sample scenario and then review results. You will then have a thorough understanding of how to use Rational Performance Tester for end-to-end Java Message Service (JMS) and WebSphere MQ performance tests.


Bharath Raj Keshavamurthy (bharathrajbk@in.ibm.com), Software performance analyst, IBM

Bharath works at IBM as an enterprise solutions performance lead analyst, designing solutions that meet systems' nonfunctional requirements. He also carries out end-to-end performance engineering for solutions as a whole, working with cross-brand products, with Java Virtual Machine (JVM) performance as his primary area of research.



23 October 2012

Create a schedule that runs multiple iterations of the test

This section explains in detail how to create and run a performance test schedule.

  1. Click File > New > Performance Test Schedule.
  2. Enter the name of the performance test schedule, and click Next.
Figure 1. Performance Schedule dialog window
  3. Specify the number of users, stages, and related details, and then click Finish.
Figure 2. Schedule Options dialog window
  4. Add the test to the user group, and save the schedule:
    1. Click User Group.
    2. Click the Add button.
    3. Click Test to add it to the user group.
Figure 3. Adding the test to the user group
Figure 4. Sample view of a performance test schedule
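Under the covers, each iteration that this schedule drives amounts to putting an XML message on the configured queue and timing the exchange, as in the JMS test built in Parts 1 and 2. Listing 1 is a minimal, hand-written sketch of that traffic, not code generated by Rational Performance Tester; the JNDI names and the XML payload are placeholder assumptions, so substitute the connection factory, queue, and message from your own test.

Listing 1. A minimal JMS send sketch (placeholder names)
// Hand-written illustration of the message traffic one schedule iteration
// drives. The JNDI names and the XML payload are placeholders.
import javax.jms.*;
import javax.naming.InitialContext;

public class XmlOverMqSketch {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext();
        ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Destination queue = (Destination) ctx.lookup("jms/TestQueue");

        Connection conn = cf.createConnection();
        try {
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(queue);

            // Time one send, roughly what the schedule measures per message
            long start = System.nanoTime();
            TextMessage msg = session.createTextMessage("<order><id>42</id></order>");
            producer.send(msg);
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("Send completed in " + elapsedMs + " ms");
        } finally {
            conn.close();
        }
    }
}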

Execute the test and examine the test log

Configure the think time and logging level, and then execute the test schedule by clicking the Run button.
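If you prefer unattended runs, Rational Performance Tester also ships a command-line launcher that can execute a schedule outside the workbench. Treat Listing 2 as a sketch only: the launcher's location and supported options vary by product version, and the workspace, project, and schedule names shown are placeholders, so verify the exact syntax against the documentation for your installation.

Listing 2. Running the schedule from the command line (version-dependent sketch)
cmdline.bat -workspace C:\RPT\workspace -project MQTestProject -schedule MQTestSchedule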

Sample run output
This section explains the results of the JMS and WebSphere MQ performance test runs and identifies the key parameters to observe.

Overall tab
This tab reports the overall test status as the percentage of requests that completed successfully.

Figure 5. Overall tab view
Response time results
This tab is central to analyzing the execution time of individual messages. With this information, you can see how quickly messages are processed end to end and gauge their performance behavior.

Figure 6. Average and maximum response times for the test run
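To make those figures concrete, Listing 3 reproduces the arithmetic behind the report's average and maximum response times from a handful of made-up per-message timings; none of these values come from the run shown here.

Listing 3. Deriving average and maximum response times (illustrative values)
public class ResponseTimeStats {
    public static void main(String[] args) {
        // Hypothetical per-message response times, in milliseconds
        long[] responseTimesMs = {120, 95, 210, 130, 88};

        long sum = 0;
        long max = Long.MIN_VALUE;
        for (long t : responseTimesMs) {
            sum += t;
            if (t > max) max = t;
        }
        double average = (double) sum / responseTimesMs.length;
        System.out.printf("Average: %.1f ms, Maximum: %d ms%n", average, max);
    }
}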

Response Time vs Time details

This tab gives you a good picture of how the system behaves over time while it processes messages:

  • If the graph runs parallel to the X axis, the system is performing consistently well.
  • If the graph slopes down toward the X axis on the right, the system is processing messages faster and faster as the run progresses. This might be due to caching of data during previous iterations of message processing.
  • If the graph rises away from the X axis on the right, system performance is deteriorating as time progresses.
Figure 7. Sample graph for processing one message
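One way to quantify what the graph shows is to fit a line to (elapsed time, response time) samples and read the slope. Listing 4 does a simple least-squares fit over made-up sample points: a slope near zero corresponds to the flat, consistent case, and a clearly positive slope corresponds to the deteriorating case.

Listing 4. Detecting a response-time trend with a least-squares slope (illustrative values)
public class TrendCheck {
    public static void main(String[] args) {
        // Made-up samples: elapsed run time (s) and response time (ms)
        double[] t  = {0, 60, 120, 180, 240};
        double[] rt = {110, 112, 125, 140, 160};

        double n = t.length, sumT = 0, sumRt = 0, sumTT = 0, sumTRt = 0;
        for (int i = 0; i < t.length; i++) {
            sumT += t[i];
            sumRt += rt[i];
            sumTT += t[i] * t[i];
            sumTRt += t[i] * rt[i];
        }
        // Least-squares slope: positive means response times are growing
        double slope = (n * sumTRt - sumT * sumRt) / (n * sumTT - sumT * sumT);
        System.out.printf("Trend: %+.3f ms per second of run time%n", slope);
    }
}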
Data volume sent and received
This graph shows the data volume (network bytes) sent and received. With this information, you can clearly relate the size of each message to the total network bandwidth requirement.

Figure 8. Data volume sent and received
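For example, multiplying the message rate by the message size turns the graph's figures into a bandwidth estimate, as in Listing 5; the rate and size used there are made up for illustration.

Listing 5. Estimating the network bandwidth requirement (illustrative values)
public class BandwidthEstimate {
    public static void main(String[] args) {
        // Hypothetical load: 50 messages/second at 4 KB per message
        double messagesPerSecond = 50;
        double bytesPerMessage = 4 * 1024;

        double bytesPerSecond = messagesPerSecond * bytesPerMessage;
        double megabitsPerSecond = bytesPerSecond * 8 / 1_000_000;
        System.out.printf("Required bandwidth: about %.2f Mbps%n", megabitsPerSecond);
    }
}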
Request throughput
This graph is also significant: it lets you determine the total number of requests processed and the time taken to process them. In other words, you can establish the request throughput (messages processed per second). The graph also shows the number of requests processed per second, the number of processing failures per second (negative testing), and so on.

Figure 9. Request throughput
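The throughput figure itself is simply messages processed divided by elapsed time, as Listing 6 shows with made-up values.

Listing 6. Computing request throughput (illustrative values)
public class ThroughputCheck {
    public static void main(String[] args) {
        // Hypothetical run: 9,000 messages over a 300-second (5-minute) run
        long messagesProcessed = 9_000;
        double elapsedSeconds = 300;

        double throughput = messagesProcessed / elapsedSeconds;
        System.out.printf("Throughput: %.1f messages/second%n", throughput);
    }
}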

You are now ready to use Rational Performance Tester to do end-to-end Java Message Service (JMS) and WebSphere MQ performance testing on your own.
