Getting familiar with testing and simulation

You use Decision Validation Services to execute rulesets on usage scenarios for testing and simulation.

Testing

To validate the accuracy of a ruleset by comparing the results that you expect with the actual results obtained during execution.

Simulations

To evaluate the potential impact of changes to rules.

You perform tests in either Decision Center or Rule Designer, which provides added debugging capabilities. Decision Center distinguishes between tests and simulations through test suite and simulation artifacts; in Rule Designer, this distinction is not visible. In Rule Designer, you can create custom scenario providers for the Decision Center consoles, and key performance indicators (KPIs) for use in the Decision Center Enterprise console.
Note: In Rule Designer you can also run tests locally on decision operations using the Excel Tabbed format to store your scenarios. You can also deploy the XOM of a decision service for testing in the Business console.

Scenarios

A scenario represents the values of the input parameters of a ruleset, that is, the input data for ruleset execution, plus any expectations about the execution results that you want to test.

The proposed format for entering your scenarios is Excel. In the Excel format, each scenario is identified by a unique Scenario ID.

Restriction: The Excel scenario files for simulations and tests do not support XMLGregorianCalendar as a date type. When you try to create an XMLGregorianCalendar object, you get the error java.lang.InstantiationException.

The workaround is to use the java.util.Date type instead and convert it to the XMLGregorianCalendar type. For more information about the workaround, see Operational Decision Manager Known Limitations.
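
For example, a minimal conversion helper in plain Java. It uses only the standard javax.xml.datatype API; the class and method names are illustrative:

    import java.util.Date;
    import java.util.GregorianCalendar;
    import javax.xml.datatype.DatatypeConfigurationException;
    import javax.xml.datatype.DatatypeFactory;
    import javax.xml.datatype.XMLGregorianCalendar;

    public final class DateConverter {

        // Builds an XMLGregorianCalendar from the java.util.Date value
        // that the Excel scenario file supplies.
        public static XMLGregorianCalendar toXMLGregorianCalendar(Date date)
                throws DatatypeConfigurationException {
            GregorianCalendar calendar = new GregorianCalendar();
            calendar.setTime(date);
            return DatatypeFactory.newInstance().newXMLGregorianCalendar(calendar);
        }
    }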

In the API, a scenario suite represents a list of scenarios. If you use the proposed Excel format to store your scenarios, your scenario suite becomes an Excel scenario file. These files contain a Scenarios worksheet for the values of the input data.

For example, the following Excel scenario file contains two scenarios that you want to validate:
  • A scenario named Big Loan, to see how your rules work with a big loan request.

  • A scenario named Small Loan, to see how your rules work with a small loan request.

Figure: Two scenarios in the Excel format
Note: You must generate Excel scenario files in Rule Designer or the Decision Center consoles from valid rule projects or decision operations (see Generating an Excel scenario file).
The Excel format is designed for projects with:
  • Relatively simple and small object models.

  • Testing that uses up to a few thousand scenarios.

Use a custom format when your scenarios contain complex input or output parameters. You can customize the behavior of the Excel format or create entirely custom formats. The most common customization is to access scenarios from a database instead of an Excel file.
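
As an illustration of the database case, the following sketch loads scenario input values with plain JDBC. This is not the Decision Validation Services provider API itself: the table, columns, and class name are hypothetical, and a real custom scenario provider must implement the DVS interfaces documented with the product.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Hypothetical reader that loads scenario input data from a database
    // instead of an Excel scenario file.
    public class DatabaseScenarioReader {

        public List<Map<String, Object>> loadScenarios(String jdbcUrl) throws Exception {
            List<Map<String, Object>> scenarios = new ArrayList<>();
            try (Connection connection = DriverManager.getConnection(jdbcUrl);
                 PreparedStatement statement = connection.prepareStatement(
                         // LOAN_REQUESTS and its columns are illustrative names.
                         "SELECT SCENARIO_ID, AMOUNT, DURATION FROM LOAN_REQUESTS");
                 ResultSet rows = statement.executeQuery()) {
                while (rows.next()) {
                    // One map of input parameter values per scenario.
                    Map<String, Object> inputParameters = new HashMap<>();
                    inputParameters.put("scenarioId", rows.getString("SCENARIO_ID"));
                    inputParameters.put("amount", rows.getBigDecimal("AMOUNT"));
                    inputParameters.put("duration", rows.getInt("DURATION"));
                    scenarios.add(inputParameters);
                }
            }
            return scenarios;
        }
    }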

Testing

You use testing to verify that a ruleset is correctly designed and written. You do so by comparing the expected results, that is, how you expect the rules to behave, with the actual results obtained when the rules are applied to the scenarios that you defined.

In the Excel format, expected results appear on a separate sheet alongside your scenarios. You specify the expected results that you want to test when you generate the Excel scenario file template. The following figure shows the Expected Results sheet for the big and small loan scenarios:

Figure: Expected results in Excel
In the example above, you test three aspects of your loan request:
  • The amount of the yearly repayment.

  • Whether the loan is approved.

  • The message generated by the loan request application.

The Expected Results sheet represents these three tests as columns; you enter the expected results for as many scenarios as you need to validate your rules.

Similarly, the Expected Execution Details sheet contains more technical tests, for example, on the list of executed rules or on the duration of the execution. You also specify these tests when you generate the Excel scenario file.
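
Conceptually, each test compares one expected value with the corresponding actual value produced by ruleset execution. The following plain-Java sketch illustrates that comparison; the map-based representation is an assumption for illustration, not the DVS API:

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.Objects;

    public class ExpectedResultsChecker {

        // Compares each expected value with the actual value from execution.
        // Returns a map from test name (a sheet column) to pass/fail.
        public static Map<String, Boolean> check(Map<String, Object> expected,
                                                 Map<String, Object> actual) {
            Map<String, Boolean> verdicts = new LinkedHashMap<>();
            for (Map.Entry<String, Object> test : expected.entrySet()) {
                Object actualValue = actual.get(test.getKey());
                verdicts.put(test.getKey(), Objects.equals(test.getValue(), actualValue));
            }
            return verdicts;
        }
    }

For the loan example, the keys would be the three Expected Results columns: the yearly repayment amount, the approval decision, and the generated message.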

In Rule Designer, you run tests using dedicated launch configurations.

Simulations

You use simulations to do what-if analysis against realistic data to evaluate the potential impact of changes you might want to make to your rules. This data corresponds to your usage scenarios, and often comes from historical databases that contain real customer information.

When you run a simulation, the report that is returned provides some business-relevant interpretation of the results, based on specified key performance indicators (KPIs).

You run simulations in Decision Center. In Rule Designer, you define KPIs and make them available to business users in the Enterprise console.
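
To give an idea of what a KPI computes, here is a stand-alone sketch of an approval-rate indicator accumulated across scenario results. The real extension point is the DVS KPI API in Rule Designer; this class only illustrates the accumulate-then-report pattern, and all names are hypothetical.

    // Hypothetical KPI that tracks the percentage of approved loans
    // across all simulated scenarios.
    public class ApprovalRateKpi {

        private int scenarioCount;
        private int approvedCount;

        // Called once per executed scenario with the approval outcome.
        public void onScenarioResult(boolean approved) {
            scenarioCount++;
            if (approved) {
                approvedCount++;
            }
        }

        // Surfaced in the simulation report as a business-relevant figure.
        public double getApprovalRatePercent() {
            return scenarioCount == 0 ? 0.0 : (100.0 * approvedCount) / scenarioCount;
        }
    }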

Note: You can also run simulations on decision services in the Decision Center Business console. However, the Business console does not use Decision Validation Services to run tests and simulations. You define KPIs and report formats for simulations in the Business console.

Reports

Running tests in Rule Designer sends the specified ruleset, the scenarios, and their expected results and execution details to the Scenario Service Provider (SSP). The SSP returns all the information that is generated during execution. A report provides a condensed version of this information, for example:

Figure: DVS report showing a scenario with three successful tests

Reports contain a Summary area showing the location of the execution, the precision level, and the success rate, that is, the percentage of scenarios that were executed successfully.

Then, the Details by Scenario area shows the results of each test on each scenario.

A test can have the following results:

Successful

When the expected results match the actual results.

Failure

When the expected results are different from the actual results.

Error

When the scenarios are not executed, for example, when an entry in the scenario file is not correctly formatted.
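
The following sketch summarizes how these three outcomes relate to each other; it is an illustration of the classification, not DVS code:

    public enum TestResult {
        SUCCESSFUL, // the expected results match the actual results
        FAILURE,    // the expected results differ from the actual results
        ERROR;      // the scenario was not executed at all

        // Hypothetical classification: an execution error takes precedence;
        // otherwise the verdict depends on the expected/actual comparison.
        public static TestResult classify(boolean executed, boolean matches) {
            if (!executed) {
                return ERROR;
            }
            return matches ? SUCCESSFUL : FAILURE;
        }
    }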