Overview of testing and simulation

Business users must be confident that the decision services are written correctly and that updates do not break the business logic that is encapsulated in the rulesets. They validate the business logic against well-defined usage scenarios by running tests and simulations against their rules.

Testing

To validate the accuracy of a ruleset by comparing the results that you expect with the actual results obtained during execution.

Simulations

To evaluate the potential impact of changes to rules.

You perform the tests and simulations in Decision Center, through test suite and simulation artifacts. In Rule Designer, you must configure the business object model (BOM) to generate Excel scenario files, and you can deploy the XOM to enable remote testing. You can also create custom data providers for the Decision Center Business console.

Scenarios

A scenario represents the values of the input parameters of a ruleset, that is, the input data to ruleset execution, plus any expectations on the execution results that you want to test.

The proposed format for entering your scenarios is Excel. In the Excel format, each scenario is identified by a unique Scenario ID.
Restriction: The Excel scenario files for simulations and tests do not support XMLGregorianCalendar as a date type. When you try to create an XMLGregorianCalendar object, you get the error java.lang.InstantiationException.

The workaround is to use the java.util.Date type instead and convert it to the XMLGregorianCalendar type. For more information about the workaround, see Operational Decision Manager Known Limitations.
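
For example, a helper in the XOM can perform the conversion with the standard Java XML datatype API. This is a minimal sketch; the class and method names are illustrative, not part of the product API.

  import java.util.Date;
  import java.util.GregorianCalendar;
  import javax.xml.datatype.DatatypeConfigurationException;
  import javax.xml.datatype.DatatypeFactory;
  import javax.xml.datatype.XMLGregorianCalendar;

  public class DateConversion {

      // Converts a java.util.Date, which the Excel scenario files support,
      // to the XMLGregorianCalendar type that the XOM expects.
      public static XMLGregorianCalendar toXMLGregorianCalendar(Date date)
              throws DatatypeConfigurationException {
          GregorianCalendar calendar = new GregorianCalendar();
          calendar.setTime(date);
          return DatatypeFactory.newInstance().newXMLGregorianCalendar(calendar);
      }
  }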

In the API, a scenario suite represents a list of scenarios. If you use the proposed Excel format to store your scenarios, your scenario suite becomes an Excel scenario file. These files contain a Scenarios worksheet for the values of the input data.

For example, the following Excel scenario file contains two scenarios that you want to validate:
  • A scenario named Big Loan, to see how your rules work with a big loan request.

  • A scenario named Small Loan, to see how your rules work with a small loan request.

Figure: Two scenarios in the Excel format
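
For illustration, the Scenarios worksheet for these two scenarios might contain rows like the following. The column names and values are hypothetical; the actual columns depend on the ruleset parameters of your own project.

  Scenario ID   Loan amount   Loan duration (months)
  Big Loan      1,000,000     240
  Small Loan    10,000        48
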
Note: You must generate Excel scenario files in Rule Designer or the Decision Center consoles from valid rule projects or decision operations (see Generating an Excel scenario file).
The Excel format is designed for projects with:
  • Relatively simple and small object models.

  • Testing that uses up to a few thousand scenarios.

You can also customize the scenario data provider to read scenarios from a source other than an Excel file, such as a database.
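
In ODM, a custom provider plugs into the scenario provider interface of the DVS API. The following sketch omits that interface and shows only the database access pattern, with a hypothetical ScenarioRow class and a hypothetical SCENARIOS table:

  import java.sql.Connection;
  import java.sql.DriverManager;
  import java.sql.ResultSet;
  import java.sql.SQLException;
  import java.sql.Statement;
  import java.util.ArrayList;
  import java.util.List;

  public class DatabaseScenarioReader {

      // Hypothetical value object that stands in for one scenario.
      public static class ScenarioRow {
          public final String scenarioId;
          public final double loanAmount;

          public ScenarioRow(String scenarioId, double loanAmount) {
              this.scenarioId = scenarioId;
              this.loanAmount = loanAmount;
          }
      }

      // Reads scenario input data from a database table instead of an
      // Excel worksheet. The SCENARIOS table and its columns are hypothetical.
      public static List<ScenarioRow> readScenarios(String jdbcUrl) throws SQLException {
          List<ScenarioRow> scenarios = new ArrayList<>();
          try (Connection connection = DriverManager.getConnection(jdbcUrl);
               Statement statement = connection.createStatement();
               ResultSet rows = statement.executeQuery(
                       "SELECT SCENARIO_ID, LOAN_AMOUNT FROM SCENARIOS")) {
              while (rows.next()) {
                  scenarios.add(new ScenarioRow(
                          rows.getString("SCENARIO_ID"),
                          rows.getDouble("LOAN_AMOUNT")));
              }
          }
          return scenarios;
      }
  }

A real provider would map each row onto the input parameters of the ruleset, in the same way that the Excel provider maps worksheet rows onto them.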

Testing

You use testing to verify that a ruleset is correctly designed and written. You do so by comparing the expected results, that is, how you expect the rules to behave, with the actual results that are obtained when the rules are applied to the scenarios that you defined.

In the Excel format, the expected results appear on a separate sheet alongside your scenarios. You specify the expected results that you want to test when you generate the Excel scenario file template. The following figure shows the Expected Results sheet for the big and small loan scenarios:

Figure: Expected results in Excel
In the preceding example, you test three aspects of your loan request:
  • The amount of the yearly repayment.

  • Whether the loan is approved.

  • The message generated by the loan request application.

The Expected Results sheet represents these three tests as columns, and you enter the expected results for as many scenarios as you need to validate your rules.
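
For illustration, the Expected Results sheet for these three tests might contain rows like the following (the values and messages are hypothetical):

  Scenario ID   Yearly repayment   Approved   Messages
  Big Loan      120,000            false      The requested amount exceeds the approval threshold
  Small Loan    2,500              true       The loan is approved
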

Similarly, the Expected Execution Details sheet includes more technical tests, for example, on the list of rules that are executed or on the duration of the execution. You also specify these tests when you generate the Excel scenario file.

Simulations

You use simulations to do what-if analysis against realistic data to evaluate the potential impact of changes you might want to make to your rules. This data corresponds to your usage scenarios, and often comes from historical databases that contain real customer information.

When you run a simulation, the report that is returned provides some business-relevant interpretation of the results, based on specified key performance indicators (KPIs).
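
For example, a KPI that reports the loan approval rate over all simulated scenarios could be computed as follows. This is a minimal sketch with a hypothetical ScenarioResult class; in the product, custom KPIs plug into the DVS extension points rather than into plain Java lists.

  import java.util.List;

  public class ApprovalRateKPI {

      // Hypothetical result of executing one scenario in a simulation.
      public static class ScenarioResult {
          public final boolean approved;

          public ScenarioResult(boolean approved) {
              this.approved = approved;
          }
      }

      // Returns the percentage of scenarios in which the loan was approved.
      public static double approvalRate(List<ScenarioResult> results) {
          if (results.isEmpty()) {
              return 0.0;
          }
          long approved = results.stream().filter(r -> r.approved).count();
          return 100.0 * approved / results.size();
      }
  }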

You run simulations in the Decision Center Business console, where you define KPIs and report formats.

Reports

When you run tests and simulations in the Decision Center Business console, a report is generated that provides information on the scenarios, their expected results, and their execution details.

Reports contain a Summary area showing the location of the execution, the precision level, and the success rate, that is, the percentage of scenarios that were executed successfully.

The Results area then shows the results of each test on each scenario.

A test can have the following results:

Successful

When the expected results match the actual results.

Failure

When the expected results are different from the actual results.

Error

When the scenarios are not executed, for example when an entry in the scenario file is not correctly formatted.