IBM Cognos Proven Practices

IBM Cognos 10.1 BI JMeter Stability Test

Nature of Document: Proven Practice; Product: IBM Cognos 10.1 BI Server; Area of Interest: Performance; Version: 1.0


This content is part of the series: IBM Cognos Proven Practices


This document accompanies a test plan that is used with Apache JMeter to stress test an IBM Cognos 10 BI Server install for stability. It describes how to use JMeter to run the test plan and how to configure it for a given set-up. In addition, it provides some background on IBM Cognos 10 BI Server report execution along with recommendations about usage and result interpretation.

The test plan and process described in this document have been used successfully in a number of client installs and in some instances of IBM Cognos led scalability testing.

This stability test plan is a good start whenever the answer to any one of the following questions is “yes”,

  • There is no other load and stability testing software such as Rational Performance Tester or Mercury LoadRunner available.
  • The system behaves unstably under load, requests fail, response times are slow or unexpected error messages occur, and a tool is needed to provoke the errors in order to troubleshoot.
  • The install needs to be checked for stability and load before moving it to the next release stage such as Testing or Pre-Production.
  • One wants to test scalability and needs quick answers on the performance gain for example by adding another instance of Report Service.


The test plan must be run by Apache JMeter version 2.4 or above. The plan is not compatible with earlier versions of Jakarta JMeter.

JMeter is available on almost every platform/OS since it is a Java based tool, so the plan can be run from all supported JMeter platforms as well.

The test plan has been recently tested on various operating systems and different set-ups. Any IBM Cognos 10 BI Server set-up is suitable for testing regardless of set-up topology or the mix of operating systems or bitness in use. The use of IBM Cognos Application Firewall (CAF) is fully supported, and it is recommended to leave CAF enabled at all times.

Exclusions and Exceptions

The test plan presented as part of this package was designed for IBM Cognos 10 BI Server releases 10.1 and 10.1 FP1. Running it against other versions may not work due to possible changes in the HTTP requests and may produce wrong results. Do not try other versions unless explicitly stated in this document. The document will be updated once new fix or refresh packs become available.

The test plan currently runs reports from the sample packages provided with the product in IBM Cognos Viewer. It does not include the following

  • Running any Studio, Dashboard (Business Insight) or Active Reports
  • Packages leveraging Dynamic Query Mode
  • SSL enabled Gateways


The test plan is provided “as is”. Its sole purpose is to provide a quick, versatile, free and easy means of checking a possibly complex install of IBM Cognos 10 BI Server for stability under load.

The test plan is considered to be simply a starting point for looking into stability and system operation in complex environments. Any more serious analysis, in particular in regards to performance, should be part of an IBM Services engagement and should be done using a commercial load testing tool such as IBM Rational Performance Tester. While the results from running the provided test plan may provide valuable indications about IBM Cognos 10 configuration issues, necessary 3rd party configuration/tuning and possible stability issues, you should always bring those indications forward to IBM in order to interpret the indications and to resolve any issues. IBM Cognos 10 is a complex product and results or issues that may surface by running the test plan may not always relate to a single specific reason and it often requires extensive knowledge of architecture and process flow in order to interpret the results correctly. Searching the IBM Information Center and the IBM Cognos Support web site for error messages encountered is always encouraged.

The test plan itself is not subject to any formal support from IBM Support. If you are interested in detailed, individual load-testing and performance tuning exercises, please contact your IBM Cognos representative.

Some Background


JMeter is a pure Java based tool developed as an open source project of the Apache Software Foundation (originally under the Jakarta umbrella). Its purpose is to issue requests of several kinds to a server and collect the responses. JMeter supports HTTP, SOAP, JDBC and other request types. It can execute multiple threads and therefore simulate real users hitting a system. It allows building test plans manually or by recording the requests of a user session through a proxy. Plans can be enriched with programmatic logic like conditional execution, parsing of responses and much more.

The home of the JMeter tool is

For details on using JMeter and what each of the elements in a plan is doing, please consult the JMeter documentation.

IBM Cognos 10 Report Execution

IBM Cognos uses the concept of conversations when executing reports of any type. A conversation is an operation consisting of multiple steps, each step technically a single request. Thus executing a report becomes a series of requests and responses rather than a single request with a single response.

Conversations start with a primary request and can have one to many secondary requests belonging to the same conversation. Each conversation starts out in synchronous mode, which means that the sender of the request waits for the recipient to respond. During that time the sender is blocked. Since this is inefficient if kept up for more than several seconds, there is a time-out known as the Primary Wait Threshold (PWT) after which a conversation will go into asynchronous mode. At that time the sequence of requests changes.

When the conversation has gone asynchronous, the sender of the request will have to send heartbeat requests at specific intervals to keep the conversation active. If those heartbeats cease, the receiving service will stop processing the current request and the conversation ends with an error instead of a result. The heartbeat has a time-out of 30 seconds by default and is known as the Secondary Wait Threshold (SWT). So every 30 seconds the client needs to send a wait() request to the same service to keep the conversation alive and indicate that it is still waiting for the result. The target service continues to process and signals when it has completed processing the current request, at which point the result can be retrieved by the sender/client. How many iterations of wait() requests are required depends entirely on system resources. If system load is high, e.g. many concurrent users, the system may require more time to process a single conversation than it would need if there was only one user on the system. That is, each conversation will potentially look different at the HTTP level depending on the actual system load situation.

This is the main reason why one cannot simply record a report execution with some tool and play it back. If a primary request is not answered within the time limit of the PWT, the conversation will have to go asynchronous; if that was not the case during recording, the played-back request will fail. Bottom line, each conversation can differ in the number of required requests each time it gets executed.
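The conversation handling described above can be sketched as a simple client loop. This is an illustrative outline only; the function names, status values and the retry_after field are assumptions for the sketch and do not reflect the actual BI Bus interface.

```python
import time

# Illustrative sketch of client-side conversation handling. The
# send_request() callable stands in for an HTTP request to the
# dispatcher; the status strings are assumptions for this sketch.

PWT = 7   # Primary Wait Threshold in seconds (IBM Cognos Viewer uses 3)
SWT = 30  # Secondary Wait Threshold in seconds (the default)

def run_report(send_request, report_path):
    # Primary request: answered synchronously if the server finishes
    # within the PWT, otherwise the conversation goes asynchronous.
    response = send_request("run", report_path, wait=PWT)
    # Secondary requests: send wait() heartbeats until the server
    # signals that processing has completed.
    while response["status"] == "working":
        time.sleep(min(SWT, response.get("retry_after", SWT)))
        response = send_request("wait", response["conversation_id"])
    # Processing complete: retrieve the report output.
    return send_request("getOutput", response["conversation_id"])
```

The number of loop iterations depends entirely on system load, which is why a recorded sequence of requests cannot simply be replayed.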

All of this is reflected by the requests which make up a conversation. If the client is a browser running IBM Cognos Viewer, the handling of the conversation is done by JavaScript embedded in the HTML pages that make up the IBM Cognos Viewer tool. If the client is an IBM Cognos SDK program, the conversation handling has to be implemented by the client itself. This applies to the JMeter plan as well, which at this point has to mimic the JavaScript functionality by carrying forward specific HTML form variables.

Conversations are used whenever something is executed, meaning reports, agents, analyses, dashboards etc. When IBM Cognos services interact amongst each other, conversations happen as well and each of the services implements the client side of conversation handling.

The processing of conversations in a client application is covered in the IBM Cognos SDK Developer Guide which is part of the IBM Cognos SDK product. However, for the purposes of this document the above information is sufficient.

Some Considerations Before you Start

Where to install JMeter

During the test the JMeter application will simulate user sessions to IBM Cognos 10.1 through a browser. JMeter opens a thread for each session (concurrent user) so a sufficiently fast CPU and amount of RAM will be necessary to avoid bottlenecks on the JMeter side.

To get the most out of the test the JMeter application should be run from a separate box where no IBM Cognos 10.1 components or databases are installed. As stated above, depending on the number of users you want to simulate, there could be a large number of threads and sockets being used which may significantly impact the machine performance and if the box is also busy running IBM Cognos 10.1, adding the additional resource demand of JMeter is not a good idea.

Due to superior threading support, a box running Linux would be best suited to run the JMeter test, but any box with a fast processor (1.5 GHz+) and enough RAM (1 GB+) should do; it all depends on the number of threads you want to simulate. Anything beyond 30 users should be run on a system which matches the above mentioned characteristics. If simulating less, just pick any box you can spare. These recommendations are based on the personal experience of the author rather than on wide-scale testing.

Gateway challenges

When simulating many users who access IBM Cognos BI Server through a gateway there is one thing to be aware of. If using a CGI gateway, the web server will instantiate one instance of the CGI executable for each user session. Each instance will be a separate process with its own memory and resources. Take this into consideration when you scale the number of users; you might just overload your gateway machine.

Therefore the best practice is to leverage the far better performing alternate gateway implementations like MOD, MOD2, MOD2_2 for Apache web servers or ISAPI for Microsoft IIS based gateways. CGI gateways are the most generic and are supported by every web server, but they are not suitable for production systems and even less for systems exposed to heavy load.

Bottom line, use MOD or ISAPI whenever possible. The test plan will work with any gateway and against an IBM Cognos Dispatcher directly as well.

Use monitoring tools

While testing with the provided plan, a closer look at the resources consumed at the operating system level and in the query or Content Manager database is advised. Imposing some higher load onto the system may reveal bottlenecks or challenges you haven't seen before.

You should have sufficient access and privileges to run administration tools like IBM DB2 Control Center, Oracle Enterprise Manager, MSSQL Enterprise Console, operating system monitoring commands and the like.

As of IBM Cognos 8.3, comprehensive system monitoring is part of the product itself. Launch a browser on a separate box and use the IBM Cognos Administration tool to monitor things such as,

  • Number of Report Service processes spawned
  • Number of threads opened for each Report Service process
  • Number of threads opened for the IBM Cognos 10 components
  • Number of connections to the query and content store DB
  • Memory consumption on the Content Manager and Dispatchers
  • Number of interactive tasks

This is not a comprehensive list but it provides a reasonable start.

For more detailed monitoring through JMX, which is available as of IBM Cognos 8.4, please refer to the System Management Methodology For IBM Cognos 10, which is available in the Cognos Proven Practices section of IBM developerWorks.


Before you can run the test plan you need to install JMeter 2.4 and set up the IBM Cognos 10.1 samples.

Check JMeter prerequisites

Since JMeter is a Java based tool you need a working JRE environment to run it. JMeter 2.4 requires at least JRE 1.5; JRE 1.6 is even better.

The development of this version of the plan was done using JMeter 2.4 with JRE

More details about JMeter prerequisites can be found here,

Install JMeter

The first thing to do is to get the Apache JMeter software and install it on the machine appointed to run it. You can download JMeter for various platforms here,

After you have downloaded a binary, simply extract it using the tool applicable to your operating system (ZIP, TAR, GZIP). Set your JRE environment variables, in particular JAVA_HOME, and you're ready to go. Start JMeter by invoking jmeter.bat (Windows) or the jmeter shell script (UNIX/Linux). For help with advanced features of JMeter like command line mode, SSL support and more, refer to the JMeter documentation.

Install the IBM Cognos 10 BI sample packages

The test plan discussed in this document will run reports from the following sample packages provided with IBM Cognos 10.1.

  • GO Data Warehouse (query) -> “Cognos Samples” deployment
  • GO Data Warehouse (analysis) -> “Cognos Samples” deployment
  • GO Sales (query) -> “Cognos Samples” deployment
  • Sales and Marketing (cube) -> “Cognos PowerCube” deployment

While the GO Data Warehouse packages and the GO Sales package use relational data sources, the Sales and Marketing package is based on an IBM Cognos PowerCube.

Installation instructions for these sample packages can be found in Chapter 9 of the IBM Cognos 10 Installation and Configuration Guide or in the online Information Center for IBM Cognos 10.

Configuration of Test Plan

After you started JMeter and loaded the test plan you’ll get presented with the JMeter GUI. The left pane allows you to browse the test plan in a hierarchical tree structure. In the right pane the properties and settings of the element selected in the left pane will be presented.

Illustration 1: JMeter expanded tree of test plan elements

JMeter expands all the elements in the tree pane by default as of version 2.1.2. This default behavior makes getting an overall view of the test plan more difficult, so it is recommended to tell JMeter not to expand all elements on startup. To do this, start JMeter with the command line switch -Jonload.expandtree=false or add the line onload.expandtree=false to the JMeter properties file.

Once JMeter opens up, expand the top level node “IBM Cognos 10.1 – BI Stability package” which will display another node of the same name. Expand this as well. Now, expand the “Overall Tests” element and you'll have a good overview of the plan as shown in the next illustration.

Illustration 2: JMeter collapsed tree of test plan elements with the logon/logoff requests, the requests to be executed and configuration properties highlighted

Once all elements of the second level have been collapsed this will provide a good overview of the test plan’s structure. There are four general configuration elements, one top element containing all the test plan steps and two elements involved with visualizing the test results.

In the above screenshot, the main configuration element “Test configuration Parameters” (highlighted in red) is involved when adjusting the plan to a specific environment by editing its properties (highlighted in yellow). More detail on these configuration properties will be provided in the Configure Parameters section.

The test itself features two elements which handle the logon and logoff requests (highlighted in green), which will be explained in the Configure Authentication section, and eight elements that execute a report (highlighted in light blue), which will be discussed in the next section.

Understand the plan structure and disable specific tests

The test plan runs eight different reports, some including prompting, using different output formats. Below is the list of reports run along with some details about the report,

  1. Budget vs Actual, GO Data Warehouse (analysis)
    • Is run in HTML format
    • Does not involve prompting
    • One “Next Page” operation, once the first page is served
    • Basically a very simple cross-tab report
  2. Order Invoices - Donald Chow, Sales Person, GO Sales (query)
    • Is run in PDF format
    • Does not involve prompting
    • The PDF (257 pages) is retrieved from the Content Store
    • A list report with some layout and branding
  3. Planned Headcount, GO Data Warehouse (analysis)
    • Is run in HTML format
    • Does not involve prompting
    • A report containing 5 charts will get rendered. The plan will download the individual GIF representations of the charts to simulate the viewing of the HTML report output.
  4. Global Bonus, GO Data Warehouse (analysis)
    • Is run in HTML format
    • Has one prompt page with two parameters. Values used are Year=2005 and Region=Americas
    • A list report
  5. Pension Plan, GO Data Warehouse (query)
    • Is run in CSV format
    • No prompting
    • A list report with much data
  6. Recruitment Report, GO Data Warehouse (analysis)
    • Is run in XLWA format
    • Has one prompt page with one parameter. Value used is Year=2006
    • A report containing several graphs and a table
  7. Historical Revenue, Sales and Marketing (cube)
    • Is run in HTML format
    • Has two prompt pages with one parameter each. Values used are Year=2006 and Month=June
    • A single chart report; the chart gets retrieved
  8. Revenue by Product Brand (2005), Sales and Marketing (cube)
    • Is run in HTML format
    • Has no prompt page
    • A report containing two charts and two lists; the charts get retrieved

The test plan is displayed in a hierarchical tree structure. The top element is “Overall Test” which contains all sub-elements. The elements are numbered; elements 01 and 99 are used for authentication only (refer to the Configure Authentication section for details). All other elements execute a specific report. If you want to exclude a report from the test, this is easily achieved by right-clicking on the element and selecting “Disable” from the context menu that appears. Those reports will be skipped when running the test.

Illustration 3: Disable plan elements by right-clicking on them and select Disable

You will see the disabled element greyed out in the left tree pane. You can re-enable the element by right-clicking on it again and selecting “Enable” from the same context menu.

Since the test plan can be perceived as a hierarchical tree, one element or node can contain other nodes which become child elements in that context. The part of the test plan which executes reports is a single element of a special type, a so-called “Transaction Controller”. This implies that all child elements get treated as one single transaction in regards to measuring execution time and outcome. At the second level, there is one child element of the overall reports element for each report executed during the test. Again, these per-report elements are Transaction Controllers. This allows one to get an overall time total for all steps of the conversation executing a specific report, regardless of report details and steps.

At the very end of the test plan you will find two Listener elements. These elements will record every request and response and present the data in a user friendly form. The “View Result Tree” element will present a tree view of the plan's requests being issued and the “Aggregate Report” element will show a view similar to a list report. The Running the Test section has more details about these Listener elements. Additional Listener elements can be added by right-clicking and adding them from the Listener category. Choices include graphs and statistical views. Refer to the JMeter documentation for details.

Configure Parameters

The element labelled “Test configuration parameters” is one of the two places one will need to touch to configure the plan for a given environment. There are typically five parameters which need to be specified and tailored to suit a given environment.

  • servername - Specify the NetBIOS or fully qualified server name for your entry point to IBM Cognos BI. This can be any Gateway or an external Dispatcher URI.
  • port - The network port for the entry point.
  • url - The URL to the entry point. This usually is everything that follows the server name and includes a leading “/” character.
    For a Gateway, the form is /<alias>/cgi-bin/<gateway_executable>.
    Gateway example: /ibmcognos/cgi-bin/cognos.cgi
    For a Dispatcher, the form is /<context_root>/servlet/dispatch/ext
    Dispatcher example: /p2pd/servlet/dispatch/ext
  • namespace - This is the namespace ID of the namespace used for authentication, if any. The namespace ID can be obtained using IBM Cognos Configuration. If you want to run without authentication, leave this blank; refer to the Configure Authentication section for details.
  • templocation - IBM Cognos 10 changed the location of temporary objects for the report execution from the Content Store to the local file system. That means, by default, charts, graphs and other binary outputs which are part of the report get stored in the locally configured (using IBM Cognos Configuration) Cognos temporary folder instead of the Content Store as in IBM Cognos 8. This affects the URLs used to retrieve those parts when displaying the report. The test plan has to use the setting which matches the configuration of IBM Cognos 10 BI to be able to retrieve the charts for test plan elements 04, 08 and 09. If this property is not set correctly, there won't be any samples labelled “Retrieve graph output” in the result tree viewer. While this doesn't make the test plan fail, it might skew the results or prevent the detection of Graphics Service failures to render the chart since missing charts won't be anticipated. Always set this to a value of FS unless the configuration of IBM Cognos 10 has been changed to use Content Store for the Temporary objects location property (under the overall Dispatcher properties), in which case you should use a value of CM.
    Illustration 4: The Temporary object location property in the Dispatcher properties set to the default value of Server File System

There are additional parameters which are used for some more advanced features. You normally don’t need to change these parameters but they can be adjusted if you understand the impact of the change.

  • Outputlocale - With this setting you can specify the output locale to use for the report outputs thus overriding the output locale specified in the IBM Cognos 10 user’s profile.
  • pwt - This specifies the Primary Wait Threshold for the report execution conversation. IBM Cognos Viewer uses a value of 3 seconds, other clients usually use 7 seconds. If set to 0 (zero), the conversation will never go asynchronous, which blocks resources in the system and has each user thread wait for the request result. This leads to lower performance but is sometimes used to troubleshoot issues.
  • swt - This specifies the Secondary Wait Threshold for the report execution conversation. The default is 30 seconds and should not be changed. Lowering the value does not help retrieve results any faster or earlier, as any service can signal the waiting client that it has completed working on the request independently of the SWT. Along the same lines, increasing the value does not necessarily save resources through less frequent heartbeat requests. This property is there mostly for troubleshooting purposes.

Configure Authentication

If your test does not involve authentication, that is, your IBM Cognos 10 BI install allows anonymous access, simply leave the namespace parameter blank. The test plan will react accordingly and skip the authentication step.

If your test does involve authentication, you first need to provide the namespace ID of the namespace to authenticate against in the namespace parameter as described in the Configure Parameters section.

In addition, you need to provide a file which contains users and passwords from that namespace. Each JMeter thread represents a user and will pick one set of credentials from this file and use it to authenticate against IBM Cognos 10 BI. If the thread reaches the end of the user file it wraps around and starts over from the top. The file needs to be named using the value you specified for the servername parameter in the Configure Parameters section with “users.xml” being appended to that name. The file needs to be placed in the /bin subdirectory of your JMeter install. For example, if your servername is “myserver”, JMeter will look for a file called “myserverusers.xml” in the <JMETER_INSTALL>/bin directory.

If you absolutely need to change the name of that file, expand the “01 – Logon” element and then expand the child elements until you find “Read user from File”. In that element you can specify a different file name. You cannot use a relative or absolute path; the file must be put in the /bin subdirectory and only the file name can be specified.

The file referred to must be an XML file which has the following layout:
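A minimal sketch of that layout is shown below. The <thread>, <parameter>, <paramname> and <paramvalue> element names follow the description in the next paragraph; the root element name and the parameter names (username, password) are assumptions for illustration, so check them against the sample file shipped with the plan.

```xml
<!-- Layout sketch; root element name and parameter names are assumptions -->
<threads>
  <thread>
    <parameter>
      <paramname>username</paramname>
      <paramvalue>user1</paramvalue>
    </parameter>
    <parameter>
      <paramname>password</paramname>
      <paramvalue>password1</paramvalue>
    </parameter>
  </thread>
</threads>
```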


There can be one to many <thread> elements which in turn contain one to many <parameter> elements. One user session is represented by a <thread> element, so all parameters for a single user session must be child elements of that <thread> element. Since we aim to provide credentials for a user, we have to provide as many <thread> elements as there are users we want to simulate, each containing two <parameter> elements which each have a <paramname> and a <paramvalue> sub element.

For a new “user”, simply repeat the whole <thread> element and specify values in the <paramvalue> sub elements. Do not change the names of any of the elements. If there are fewer <thread> elements than threads spawned for user sessions by JMeter, the actual thread will wrap around and start over from the first element again.
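The wrap-around behaviour amounts to a modulo lookup over the list of credentials; a minimal sketch (the data structure is illustrative):

```python
# Each JMeter thread picks credentials by its zero-based thread number;
# past the end of the list the selection wraps around to the start.
def pick_credentials(credentials, thread_number):
    return credentials[thread_number % len(credentials)]

users = [("user1", "pass1"), ("user2", "pass2"), ("user3", "pass3")]
```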

A file that contains multiple users will look something like this,
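The element names <thread>, <parameter>, <paramname> and <paramvalue> are those described above; the root element name and the parameter names are illustrative assumptions. A two-user file would repeat the <thread> element once per user:

```xml
<!-- Two simulated users; root element and parameter names are assumptions -->
<threads>
  <thread>
    <parameter>
      <paramname>username</paramname>
      <paramvalue>user1</paramvalue>
    </parameter>
    <parameter>
      <paramname>password</paramname>
      <paramvalue>password1</paramvalue>
    </parameter>
  </thread>
  <thread>
    <parameter>
      <paramname>username</paramname>
      <paramvalue>user2</paramvalue>
    </parameter>
    <parameter>
      <paramname>password</paramname>
      <paramvalue>password2</paramvalue>
    </parameter>
  </thread>
</threads>
```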


Keep in mind that a user can log in to IBM Cognos 10 BI multiple times and therefore create multiple sessions. In theory one set of credentials is all you need to simulate any number of users logging in but if you need to audit the test, it’s a good idea to have at least one set of credentials per user which you want to simulate.

Configure number of threads and ramp up

By default the test plan runs through once using only a single thread.

It's most likely that multiple users will need to be simulated, so the number of threads and possibly the number of iterations will need to be increased. This can be done by clicking on the second “IBM Cognos 10.1 - BI Stability package” element in the tree. This is the so-called “Thread Group” element.

Illustration 5: Properties of the Thread Group element

The properties of the Thread Group element will be displayed on the right JMeter pane and the number of threads, ramp up time and a loop count can be specified. JMeter will distribute the creation of new threads evenly over the specified ramp up time. So if you have 5 threads and a ramp up time of 25 seconds a new thread will be started every 5 seconds.
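The even distribution of thread starts can be expressed as a small calculation; this sketch just illustrates the arithmetic described above:

```python
# JMeter starts a new thread every (ramp_up / num_threads) seconds,
# so thread i begins at offset i * (ramp_up / num_threads).
def thread_start_times(num_threads, ramp_up_seconds):
    interval = ramp_up_seconds / num_threads
    return [i * interval for i in range(num_threads)]
```

For example, 5 threads with a 25 second ramp up start at 0, 5, 10, 15 and 20 seconds into the test.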

In addition to these parameters, there is a setting which specifies what action to take if a thread hits an error. This is set to “Stop Thread” and you must not change it as it allows you to easily identify errors which occurred during the execution of the test. The plan has been built to verify successful execution of a request. If the result is something unexpected the thread will fail and you can identify the exact position in the plan where this happened.

Running the Test

The test plan can be started and stopped by using the Run menu.

All activity (ie: the result of each single request) will be recorded by the two Listener elements in the plan. It's a good practice to clear the recorded data before each test run. This can be done from the JMeter “Run” menu by selecting “Clear All”. The keyboard shortcuts are CTRL+E to clear the results and CTRL+R to start the test.

Upon start there will be a light green indicator in the upper right next to two numbers indicating how many threads of the maximum configured are actually running. For example, the two numbers 10/30 mean 10 threads are running out of a maximum of 30 threads. If the number of running threads drops below the maximum after the initial ramp up period has passed, you should check whether the threads completed or stopped due to an error. This can be done in the Listener elements.

Usually after starting the test you click on one of the Listener elements to monitor progress. While the Aggregate Report is more an overview, the tree will show the details. Just click on one to view its content.

The “View Result Tree” element renders a tree view of all the requests being issued and the responses received for them. This tree is “live” so it grows while running the test.

One important point to note is that an element will appear there only after the request has been sent and a response has been received. For example, if you don't see the initial Login element appearing, it may be because you are trying to connect to a wrong URL and the web server time-out is 30 seconds.

Keep in mind that elements appear only when a response was received, in other words the request is finished. For each element in the tree, you can click on it and the right pane will show you information about the request, such as the data sent and the response received. Investigate the responses to get an idea of what happened in that request.

Several elements, namely the grouping ones, won't have a response because they don't represent actual requests but merely structure the test plan.

If in the result tree listener a request is shown in red this indicates that the request was not successful and the thread which issued the request was stopped. Take a look at the response tab of that request. This should give you an idea about what happened.

The elements appear to be out of order if you look at the numbering, but this is because the tree is rendered with the child elements first, followed by the parent element. Therefore the parent element shows up last, as seen in the picture below.

Illustration 6: The Result Tree Listener output showing responses in an indented list of numbered elements

Using the Login process as an example, you can see how the Logon action consists of two single requests. Reading top to bottom, there are two responses, indented to the right and labelled with sub-numbering like x.y or x.y.z, which appear before the final element labelled “01-Logon” indicating the Logon action is complete. The “01-Logon” element allows you to gauge a time for the complete logon action. This method of displaying results applies to the whole list of results. Summary elements such as “01-Logon” are displayed unindented and the steps contained within a summary element are displayed indented, thus forming blocks or groups for each action.

The Aggregate Report Listener shows a single line per distinct test element with several timing values per row, such as the minimum, maximum, median and average time for that element. It also shows how often an element was executed, so you can tell whether all of your configured threads have passed a certain element yet.
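If you save the test results to a CSV file (for example through a Simple Data Writer listener), the same per-element statistics can be reproduced offline. The Python sketch below is a minimal illustration, assuming JMeter's default CSV columns “label” and “elapsed” (milliseconds); the file name results.csv mentioned in the comment is just an example.

```python
import statistics
from collections import defaultdict

def aggregate(samples):
    """Group JMeter samples by label and compute the basic timing
    statistics the Aggregate Report Listener shows: count, min, max,
    median and average. Each sample is a dict with at least the
    'label' and 'elapsed' (milliseconds) CSV columns."""
    times = defaultdict(list)
    for row in samples:
        times[row["label"]].append(int(row["elapsed"]))
    return {
        label: {
            "count": len(values),
            "min": min(values),
            "max": max(values),
            "median": statistics.median(values),
            "average": statistics.mean(values),
        }
        for label, values in times.items()
    }

# Two hand-made samples; in practice you would feed it
# csv.DictReader(open("results.csv")) instead.
stats = aggregate([
    {"label": "01-Logon", "elapsed": "120"},
    {"label": "01-Logon", "elapsed": "180"},
])
print(stats["01-Logon"]["median"])  # 150.0
```

Such a script is handy when comparing several test runs side by side without re-opening each result file in the JMeter GUI.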

Illustration 7: The Aggregate Result Listener output – example showing a table with response elements for rows and measures taken in the columns

You can easily read the timing for each action or sub-request from this. There are elements for the overall test and for the three data access types as well.

You can stop the test at any time by employing the Stop command from the Run menu or pressing CTRL+. (period). If you run a high number of threads it may take a while for the test to stop.

On rare occasions it has been noticed that JMeter hangs. This usually shows as no more progress in the Listeners while the threads appear to have died; the indicator in the upper right is off even though the test hasn’t completed yet. Simply rerun the test rather than troubleshooting. In the author’s experience this has never occurred more than once or twice in a session, and as JMeter matures it should disappear.

Result Interpretation

After the test has completed, the best place to look is the Aggregate Report Listener. It holds information about how long each request, each operation and the overall test took to run, along with statistical aggregates such as median and average. These timing metrics will be the main indicators of performance.

Since we aim at stability here, the error column should show 0% entries only. If there were errors for a certain row, its error column will show something other than 0%. Switch to the result tree and look for red entries. Investigate the responses and react to the error messages visible there by adjusting IBM Cognos 10 configuration or system parameters.
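The same 0% check can be scripted against a saved CSV result file. The following Python sketch is a minimal example, assuming JMeter's default CSV columns “label” and “success” (the string "true" or "false"); any element whose rate is not 0.0 deserves a closer look in the result tree.

```python
def error_rates(samples):
    """Percentage of failed requests per test element. Each sample
    is a dict with the 'label' and 'success' ("true"/"false")
    columns of JMeter's default CSV result format."""
    totals, failures = {}, {}
    for row in samples:
        label = row["label"]
        totals[label] = totals.get(label, 0) + 1
        if row["success"].lower() != "true":
            failures[label] = failures.get(label, 0) + 1
    return {label: 100.0 * failures.get(label, 0) / count
            for label, count in totals.items()}

# One successful and one failed logon -> a 50% error rate.
rates = error_rates([
    {"label": "01-Logon", "success": "true"},
    {"label": "01-Logon", "success": "false"},
])
print(rates)  # {'01-Logon': 50.0}
```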

To get more out of the results, compare the timing between different setups, for example after changing the Gateway type or adding another Report Server. This will demonstrate how performance is influenced. Always start with a baseline: perform three runs of the same test before you start reacting to the results by changing the system or configuration.

Stability-wise, the test should always complete without any errors. If you see specific reports failing repeatedly, concentrate on those and disable the rest.

The fact that the test may show some errors does not mean that the product is unstable per se; it indicates that further action is needed to investigate those errors. Most of them can be remedied by adjusting configuration, while some may be the result of system bottlenecks. In all cases consult Appendix A and the related IBM Cognos Technotes before taking action. Search for the error codes read from the responses. If unsure, log a call with IBM Support.

In cases where BiBusTkServerMain processes crash and leave a minidump (Windows) or a core (UNIX/Linux), log a case with IBM Support immediately and provide the test plan and configuration.

To sum up, the results of the test will give you an understanding of the stability of your system under certain circumstances. Remember that this test plan is just a sample and can never substitute an on-site engagement from trained and skilled IBM Cognos staff or partners running serious performance and scalability tests. Reach out to your IBM Cognos representative to arrange for that to happen if you need solid and confirmed statements regarding stability and performance.

Appendix A - Troubleshooting

This Appendix contains some of the most common error messages and hints to solve the cause of the error. It is by no means a comprehensive guide, just a quick reference. For details look up the errors in Technotes or contact IBM Support.

JMeter errors

  • JMeter GUI not fully functional, some menu options such as Open are grayed out. The console shows a Java exception.
    Action: This is an issue with the start script (jmeter.bat/.sh) which must be fixed in the script. Possibly some other JRE from the PATH is taking precedence. The author fixed this by setting JAVA_HOME at the beginning of the script and adding the line set JM_LAUNCH=%JAVA_HOME%\bin\%JM_LAUNCH% before the script calls the JRE.
  • JMeter doesn't start.
    Action: Ensure you're using JRE version 1.5 or 1.6.
  • JMeter UI is slow to respond or doesn't respond at all.
    Action: Consider running JMeter from a more powerful computer or limit the number of threads (users) to 20 – 30.
  • In the Result Tree Listener, test plan elements 04, 08 and 09 don't contain any requests for fetching graphics.
    Action: These test plan elements should contain such requests; element 09, for example, should pull in 4 charts. Verify that the templocation setting in the plan matches the IBM Cognos 10 configuration for the temporary object location.

IBM Cognos errors, seen as error pages in the result of a request:

  • Error: DPR-ERR-2002 Unable to execute the request because there was no process available within the configured time limit. The server may be busy.
    Action: Increase the Dispatcher Queue time-out.
  • Error: RSV-ERR-0021 "The server cannot find a primary request with ID <someID>."
    Action: Check your application server thread limit is not exceeded.
