IBM Operational Decision Manager Tips and Tricks
By Andy Flatt
Welcome to the Operational Decision Manager Tips and Tricks Blog.
This is a blog for IBM software developers and technical professionals to share their own thoughts and comments on Operational Decision Manager with an internal and external audience. The blog aims to publish technical information that answers common questions about putting Operational Decision Manager into development, test and production. Please take advantage of the blog to reach out to the community with questions, thoughts and comments. If you find this blog interesting, you can follow it by clicking the button in the top right.
Initially we are looking for posts from IBMers working with ODM, and in future we plan to invite our user base to contribute too. If you are interested in publishing information to this blog, please contact Andy Flatt (firstname.lastname@example.org).
The views and opinions expressed on this blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of IBM, the IBM staff, and/or any/all contributors to this site.
Could you explain the difference between business rules and event rules? What are the differences from a business context, and when is it appropriate to use one over the other?
We recently built a sample application in IBM Operational Decision Manager for the IBM Redbooks publication Implementing an Advanced Application Using Processes, Rules, Events, and Reports. The sample application was a warranty reporting solution that processes post-purchase warranty claims for product repair. In the sample application we defined two rules:
if the value of a warranty claim is greater than $1,000 then set manager approval to mandatory
if the number of warranty claims of the customer within the last month is greater than 4 then send a request to the call center to contact the customer
From a business user’s perspective, both of the above are rules. They reflect a business policy that the warranty system needs to implement. But there is a fundamental difference. The first example is a business rule and the second is an event rule. The distinguishing difference between the two is that the first has no reference to time, while the second does (for example “within the last month”).
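To make the distinction concrete, here is an illustrative sketch of the two warranty policies expressed in plain Java (not in ODM's rule language — the class and method names are my own). The business rule needs only the current claim, while the event rule needs the claim history so it can count occurrences "within the last month":

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;

// Illustrative only: the two warranty policies from the sample,
// written as plain Java to highlight the difference between them.
public class WarrantyRules {

    // Business rule: stateless and time-free. Given a single claim
    // value, decide whether manager approval is mandatory.
    static boolean requiresManagerApproval(double claimValue) {
        return claimValue > 1000.0;
    }

    // Event rule: depends on a time window ("within the last month").
    // The engine must correlate events over time, so it needs the
    // customer's claim history, not just the current claim.
    static boolean contactCustomer(List<Instant> claimTimes, Instant now) {
        Instant oneMonthAgo = now.minus(Duration.ofDays(30));
        long recentClaims = claimTimes.stream()
                .filter(t -> t.isAfter(oneMonthAgo))
                .count();
        return recentClaims > 4;
    }
}
```

The signatures tell the story: the first method is a pure function of one value, whereas the second cannot be evaluated without state accumulated over time, which is exactly why an event engine rather than a stateless decision service is needed.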
The way we use business rules and event rules in the context of a business process is fundamentally different too. Business rules are typically called from a process in the same way any external, stateless service would be called: the process sends data to a Decision Service and receives a synchronous response.
Conversely, we can use business events to trigger a business process. The business event detects a certain situation that requires prompt action, and invokes the appropriate business process. This is an example of a sense-and-response pattern.
By combining business rules and business events we are able to use rules to trigger business processes, and to incorporate rules that determine how a business process should run. A powerful combination indeed.
Martin Keen is an IBM Redbooks Project Leader. He leads publications on many areas of IBM middleware. Follow Martin on Twitter at @MartinRTP.
By Andy Flatt
This blog discusses how to build a scenario where IBM Operational Decision Manager (ODM) rules are invoked from a CICS Java program.
The starting point is the COBOL mini loan sample running in a zRules Execution Server (zRES) inside a CICS JVM Server. This is shown in Figure 1.
The CICS Java miniloan program will call the same ODM rule application used by the COBOL miniloan sample. The scenario creates a J2SE ODM rule engine with the same configuration as the zRES. The Java application will create its own J2SE eXecution Unit (XU) exclusively for this Java application. This is shown in Figure 2.
By using the same ra.xml configuration file, the J2SE XU will have access to the same management and rule deployment options as the existing zRES.
The way this scenario is implemented, each Java application creates its own XU on first execution. The Java application will reuse the XU on subsequent executions.
Prepare your environment
You will need to use the ODM for z/OS infocenter to create a working ODM in CICS configuration. Ensure that you have completed all of the configuration steps there before you continue.
Import the Miniloan Java XOM
The first step is to import the z/OS miniloan sample into Eclipse so that we can use it to build this example. In order to do this you must have the Rule Designer Eclipse environment installed with the samples and tutorials. The samples and tutorials are an installation option provided by IBM Installation Manager.
Create the Java project
The next step is to create a Java project and configure the build path. This can be done by opening the Java perspective in Rule Designer.
Modify the XOM
It is useful if the XOM's borrower and loan classes override the Object.toString method. This allows us to easily print the borrower and loan to the logs to verify rule execution. To do this, open the MiniLoanDemo-xom project and edit the Borrower.java and Loan.java files.
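A toString override for Borrower might look like the following sketch. The field names here are illustrative — the real miniloan XOM classes may differ — and Loan.java gets an equivalent override:

```java
// Hypothetical sketch of a toString override for the miniloan XOM.
// Field names are illustrative; adapt them to the actual XOM classes.
// Overriding Object.toString lets the CICS logs show readable
// borrower details after rule execution.
public class Borrower {
    private final String name;
    private final int creditScore;
    private final int yearlyIncome;

    public Borrower(String name, int creditScore, int yearlyIncome) {
        this.name = name;
        this.creditScore = creditScore;
        this.yearlyIncome = yearlyIncome;
    }

    @Override
    public String toString() {
        return "Borrower[name=" + name
                + ", creditScore=" + creditScore
                + ", yearlyIncome=" + yearlyIncome + "]";
    }
}
```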
Create the Java CICS Application code
We are now going to create a single static class to demonstrate the minimal code required to call rules using a J2SE XU with a stateless session.
Configure CICS to run the Java mini loan application
The next step is to configure CICS so that it knows about the new Java application. This is done by telling the JVM profile about the jar file and creating the CICS resources required.
Logging for the XU is written to the dfhjvmerr file, and can be configured by providing the XU with extra logging information. To check that the XU did not encounter any errors, search the dfhjvmerr log for SEVERE or ERROR messages.
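If you want to automate that check, a small helper along these lines will do; the log lines are passed in as strings here, whereas in practice you would read them from the dfhjvmerr file:

```java
import java.util.List;
import java.util.stream.Collectors;

// Minimal sketch: filter XU log output for SEVERE or ERROR entries,
// as described above for checking dfhjvmerr.
public class XuLogCheck {

    static List<String> findProblems(List<String> logLines) {
        return logLines.stream()
                .filter(l -> l.contains("SEVERE") || l.contains("ERROR"))
                .collect(Collectors.toList());
    }
}
```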
After the application has run once, the XU will appear in the Server Info tab of the zRules Execution Server management console. You can then run diagnostics, update the miniloan application, and have any rule updates picked up by the application.
I was recently asked if it was possible to speed up migrations of the Rule Team Server (RTS) database between versions of IBM Operational Decision Manager (ODM). Given that the RTS database maintains a version history for each artefact, if you've been using the product for a while and actively making changes, it's possible that you've generated a large amount of history that will then be time-consuming to migrate.
In this article, I will provide a step-by-step example of how to use the supplied archiving tools to minimise the size of your RTS database prior to migration. The archiving tools work by copying all active rules, plus any rule history back to a date you specify, into a new database schema, effectively performing a pruning operation. The new schema is created within your existing database, and then SQL statements move the data between the existing and new tables. All baselines which either reference an active rule or contain history more recent than the supplied archiving date are preserved, and any data stored in an extension model will also be preserved.
In this example, I am using WebSphere ILOG JRules 7.1.1 on RHEL 6.3 with DB2 10.5 and WebSphere Application Server 8.5. You should be able to transpose these instructions easily onto your own environment.
There is a page in the InfoCenter which covers this topic, which you can review for further detail.
Step by step example
1) Prepare your environment
The archiving tools make use of some Ant tasks, so start by making sure your environment is configured correctly to use Ant.
2) Backup your RTS database
The archiving process involves creating a new database schema, and then running a number of SQL scripts against your RTS database. Before you start, take a backup of your RTS database using your favourite method. For DB2, I stopped all connections to the database using db2 force application all and used the DB2 backup command in a newly created directory: db2 backup database <db_name>.
3) Create a new database user
This user will be used to create the new schema. For DB2 on RHEL, this was as simple as running adduser <newuser>. I used archive on my third attempt; if it's been a while, I suggest you read the DB2 user name restrictions.
4) Create a new schema in your existing database for the archive
To create the new schema into which your active rules and selected history will be copied, you have two choices: run the Installation Settings wizard, or use the set-extensions Ant task.
Both methods create the new schema based on the username supplied for the RTS JEE data source. The new schema is given the same name as the user whose credentials are used to access the data source, and it must be created in the same database as the existing schema.
I used the Installation Settings wizard. To do this, I first changed the JDBC name of my existing RTS datasource from jdbc/ilogDataSource to jdbc/ilogDataSourceOriginal and then created a new data source "jdbc/ilogDataSource" using a new J2C authentication alias for component-managed authentication pointing at my new database user.
(The rtsArchive J2C ID has the username archive.)
After a stop and restart of WAS, visiting the Enterprise Console presented me with the Installation Settings wizard so that I could run through the steps.
If you use the set-extensions Ant task instead, this connects to the data source you specify and then executes SQL to create the tables and upload extensions. Here's an example:
ant set-extensions -Dserver.url=http://localhost:9080/teamserver -DdatasourceName=jdbc/ilogDataSource -DextensionModel=defaultExtension.brmx -DextensionData=defaultExtension.brdx
IMPORTANT: If your existing RTS database uses an extension model, you must create the new schema using the same extension model and data (the defaults reside in <install_dir>/teamserver/bin).
5) Run the gen-archive-repository-role Ant task and execute the resulting SQL
This Ant task generates a SQL script that creates a new role, and then grants privileges to that role to perform certain operations on your existing RTS database. The generated SQL must be executed against your RTS database by a user with administrative rights.
I'm providing a fully-qualified command line version here for you to modify and use, but note that you can set any of the -D properties in the <install_dir>/teamserver/bin/teamserver-anttasks.properties file instead. If you do this, you can omit them from the command line.
ant gen-archive-repository-role -Dserver.url=http://localhost:9080/teamserver -DdatasourceName=<target_datasource> -DoldDatabaseSchemaName=<existing_db_username> -DoutputFile=gen-archive-repository-role.sql
I ran the generated SQL against DB2 from the command line.
6) Run the gen-archive-repository-script Ant task and execute the resulting SQL
This Ant task generates the SQL which copies artefacts from the existing RTS database and into the new schema. Once the script has been generated you can either execute it with the execute-schema Ant task or use your favourite database tooling.
ant gen-archive-repository-script -Dserver.url=http://localhost:9080/teamserver -DdatasourceName=<target_datasource> -DoldDatabaseSchemaName=<existing_db_username> -DarchiveDate=<YYYY-MM-DD> -DoutputFile=gen-archive-repository-script.sql
For our purposes here of pruning the history, the key property is archiveDate. If it were set to 2013-11-11, then in addition to all active rules, history for all artefacts newer than that date would be copied into the new schema.
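The copy criterion can be summarised as a simple predicate. This is a sketch of the semantics described above, not the actual SQL the Ant task generates, and the method name is my own:

```java
import java.time.LocalDate;

// Sketch of the archiving semantics: an artefact version is copied to
// the new schema if it is still active, or if its history date falls
// on or after the supplied archive date. Everything else is pruned.
public class ArchiveFilter {

    static boolean isCopied(boolean active,
                            LocalDate historyDate,
                            LocalDate archiveDate) {
        return active || !historyDate.isBefore(archiveDate);
    }
}
```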
Again, I ran the generated SQL against DB2 from the command line.
IMPORTANT: Ensure you run the generated SQL in this script as the new database user created in step 3.
Once this script had completed, I went back to the Enterprise Console and checked the history for one of my rules to confirm the pruning had worked as expected.
By Mark Hiscock
Sometimes DB2 has not been configured properly for JDBC access. This can happen because ODM is the first program product installed on z/OS to use a JDBC connection to the database.
The main issue is that DB2 is not able to support JDBC connections, and the following error message is received in the RES Console: “SQL Error: Unknown column name TABLE_NAME. ERRORCODE=-4460”
This issue can be resolved by setting the DESCSTAT DB2 property to YES and performing a rebind of the plans required for JDBC.