Fix Readme
Abstract
This document provides the instructions for moving IBM® Case Manager on premises to IBM Business Automation Workflow on containers 22.0.2, 23.0.1, or 23.0.2.
Content
Table of Contents
- Moving IBM Case Manager to IBM Business Automation Workflow on containers 22.0.2 or 23.x
- Assess your readiness
- Frequently asked questions
- Prepare
- Migrate
- Optional post-migration steps
- Verify your installation and migration
- Additional information
- Trademarks and service marks
Moving IBM Case Manager to IBM Business Automation Workflow on containers 22.0.2 or 23.x
You can move an on-premises IBM Case Manager environment to the stand-alone IBM Business Automation Workflow on containers and still keep the databases and LDAP of your on-premises environment. The IBM Case Manager system is composed of IBM FileNet® Content Platform Engine, IBM Content Navigator, and IBM Case Manager. To move IBM Case Manager from on premises to containers, you must also move IBM FileNet Content Platform Engine and IBM Content Navigator to Business Automation Workflow on containers.
Before you move, it is important to understand what you need, what options you have, and your license entitlements.
Assess your readiness
Versions supported
For IBM Case Manager 5.3.3 Interim Fix 10 and later production environments, you can move to the following versions:
- Business Automation Workflow on containers 22.0.2
- Business Automation Workflow on containers 23.0.1 or 23.0.2
Before you move Case Manager to Business Automation Workflow on containers, be aware of the prerequisites.
- Production environment
- It is recommended that you install IBM Case Manager 5.3.3 Interim Fix 10 or later with a production environment deployment. For more information about creating a production deployment, see Configuring IBM Case Manager.
- When you configure the Case Manager profile, select the production environment and deploy Case Manager with the default configuration tasks.
- For 22.0.2 or 23.0.1 only, ensure that the following conditions are met for all the IBM Case Manager on-premises environment case solutions that you might want to work with.
- The case solutions are imported to the Design Object Store.
- The case solutions are deployed to the required target area or areas.
- Security manifests are applied to all the target areas where the solutions are deployed.
- Databases and object stores
- The database that has the Global Configuration Database, IBM Content Navigator, Design Object Store, and Target Object Store is the same database that is used after the move. Ensure that the database will be available after the move. If you use file storage object stores, take a backup.
- The same LDAP is used in on-premises and container environments.
- The DOCS (Documents) Object Store must be created in the on-premises environment before the move. If data persistence is configured in Application Engine, you must also create the AEOS (Application Engine object store).
When you move the IBM Case Manager production environment to IBM Business Automation Workflow on containers, be aware of the following limitations and restrictions.
- Solutions that use forms do not work as expected in container environments because forms are not supported.
- Object stores from the on-premises environment are not used after the move. The IBM Content Navigator URL of the on-premises environment no longer works.
- The traditional Content Platform Engine cannot be used along with the container Content Platform Engine after the move.
- For 22.0.2 or 23.0.1 only, case solutions must be deployed and the audit or security manifest applied to the required target areas before the move. After the move, the on-premises solutions cannot be redeployed and the audit or security manifest cannot be modified.
- Case widget plug-ins must be refactored before they are deployed in Business Automation Workflow on containers. The IBM Case Manager path must be an absolute URL, not a relative URL.
Frequently asked questions
Prepare
This document describes the Content Platform Engine (CPE) and IBM Content Navigator (ICN) actions required for moving IBM Case Manager. If you require any other capabilities of FileNet Content Manager, see Moving from on premises to containers within FileNet Content Manager environment.
Migration planning sheet
Use the migration planning sheet to gather the on-premises environment configuration data that you must have available when you move. After you generate the CR, you must check that all the values listed in the migration planning sheet are correctly filled in. See the "Migration planning sheet."
The databases of CPE and ICN will be used after the move, so make sure that the databases are online. The Lightweight Directory Access Protocol (LDAP) server that is used in the on-premises environment will be used after the move, so make sure that the LDAP server is running. When you are ready to start the move, stop the CPE, ICN, and IBM Case Manager applications that are hosted on the IBM WebSphere or Oracle® WebLogic server.
- On the CPE server, create and initialize the required object stores, with the default add-ons.
One object store is used for DOCS and the other for Application Engine. If data persistence is configured in Application Engine, you must create the AEOS (Application Engine object store). It must exist before you move because you cannot add new object stores until the whole migration process is complete.
- Export all on-premises solutions of the production environment.
- Export the security manifest and audit manifest, if any.
- Take a backup of the custom widgets.
- Take a backup of the CPE and ICN databases.
- Ensure that the following conditions are met for all the IBM Case Manager on-premises environment case solutions that you might want to work with:
- The solutions are imported to the Design Object Store.
- The solutions are deployed to the required target area or areas.
- Security manifests are applied to all the target areas where the solutions are deployed.
- To prepare for installation, see "Installing, configuring, and upgrading IBM Business Automation Workflow on containers."
- To prepare the cluster for installation, do all the steps until 4g in "Installing a production deployment."
Migrate
After you prepare the cluster, you can generate and run a custom resource (CR) file. The CR file acts as a template of what you will be installing, and can be customized according to the components that the operator supports for installation.
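For orientation, a heavily stripped-down CR might look like the following sketch. The apiVersion, kind, and metadata name shown here are assumptions inferred from the CR file name and secret names used later in this document; your generated CR contains many more sections:

```yaml
# Illustrative skeleton only, not a deployable CR.
apiVersion: icp4a.ibm.com/v1
kind: ICP4ACluster        # assumed kind; confirm against your generated CR
metadata:
  name: icp4adeploy       # assumed deployment name
spec:
  shared_configuration:
    sc_deployment_license: production
    sc_content_initialization: false   # reuse existing databases; do not initialize content
```

The generate-cr mode described below fills in the real sections from your property files.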
Generate the CR
Generate the CR for deploying Business Automation Workflow on containers on the OpenShift Container Platform. The time required to run the CR file depends on the information that you provide.
- Under the /root/cert-kubernetes/scripts folder, run the baw-prerequisites.sh script.
Running the prerequisites script gives you the instructions to follow.
Usage: baw-prerequisites.sh -m [modetype]
Options:
  -h  Display help
  -m  The valid mode types are [property], [generate], [validate], or [generate-cr]
STEP1: Run the script in [property] mode to create the user property files (DB/LDAP property files) with default values (database name/user).
STEP2: Modify the DB/LDAP/User property files with your values.
STEP3: Run the script in [generate] mode to generate the DB SQL statement files and YAML template for the secrets, based on the values in the property files.
STEP4: Create the databases and secrets manually based on the modified DB SQL statement file and YAML templates for the secret.
STEP5: Run the script in [validate] mode to check that the databases and secrets are created before you deploy Business Automation Workflow.
STEP6: Run the script in [generate-cr] mode to generate the Business Automation Workflow custom resource based on the property files.
- Run the script in property mode to generate the properties files:
./baw-prerequisites.sh -m property
- For the production license that you purchased, select 1.
- Select 1 for production and 2 for non-production.
- Select the correct number for your IBM FileNet Content Manager license.
- Select the correct number for your Cloud Pak for Business Automation license.
- Select the LDAP that you use in the on-premises environment. By default, LDAP SSL is enabled. You can change the LDAP_SSL_ENABLED parameter in the /root/cert-kubernetes/scripts/baw-prerequisites/propertyfile/baw_LDAP.property file later.
What is the LDAP type that is used for this deployment?
1) Microsoft Active Directory
2) IBM Tivoli Directory Server / Security Directory Server
Enter a valid option [1 to 2]:
- Select the database type. By default, database SSL is enabled. You can change the DATABASE_SSL_ENABLED parameter in the /root/cert-kubernetes/scripts/baw-prerequisites/propertyfile/baw_db_server.property file later.
What is the Database type that is used for this deployment?
1) IBM Db2 Database
2) Oracle
3) Microsoft SQL Server
4) PostgreSQL
- Enter an alias name for the database servers or instances to be used by the Business Automation Workflow deployment.
Enter the alias name(s) for database server(s)/instance(s) to be used by Business Automation Workflow on containers.
(NOTE: NOT the host name of the database server, and CANNOT include a dot [.] character)
(NOTE: This key supports comma-separated lists, for example: dbserver1,dbserver2,dbserver3)
- Enter the storage class name. For more information about storage, see "Planning for IBM Business Automation Workflow on containers."
To provision the persistent volumes and volume claims
please enter the file storage classname for slow storage(RWX): nfs-client
please enter the file storage classname for medium storage(RWX): nfs-client
please enter the file storage classname for fast storage(RWX): nfs-client
please enter the block storage classname for Zen(RWO): nfs-client
- View your results.
The database and LDAP property files for Business Automation Workflow on containers are created, followed by the property file for each database name and user, followed by the Business Automation Workflow on containers property files.
- Enter the <Required> values in all the files under /root/cert-kubernetes/scripts/baw-prerequisites/propertyfile.
Notes: The key names are created by the baw-prerequisites.sh script and you cannot edit them.
The values in the property files must be inside double quotation marks.
- baw_db_server.property: Properties for the database server that is used by Business Automation Workflow on containers, such as DATABASE_SERVERNAME, DATABASE_PORT, and DATABASE_SSL_ENABLE. The value of <DB_SERVER_LIST> is an alias for the database servers. The key supports comma-separated lists.
- baw_db_name_user.property: Properties for the database name and user name that are required by each component of the Business Automation Workflow on containers deployment, such as GCD_DB_NAME, GCD_DB_USER_NAME, and GCD_DB_USER_PASSWORD. Change the <DB_SERVER_NAME> prefix to assign which database is used by the component. The value of <DB_SERVER_NAME> must match the value of <DB_SERVER_LIST>, which is defined in the baw_db_server.property file. The value for User/Password must not include the special characters "=", ".", or "\".
- baw_LDAP.property: Properties for the LDAP server that is used by Business Automation Workflow on containers, such as LDAP_SERVER, LDAP_PORT, LDAP_BASE_DN, LDAP_BIND_DN, and LDAP_BIND_DN_PASSWORD. The values in this file must not include the special character '"'.
- baw_user_profile.property: Properties for the global values that are used by the Business Automation Workflow on containers deployment, such as "sc_deployment_license", and for the values that are used by each component, such as <APPLOGIN_USER> and <APPLOGIN_PASSWORD>. The value for User/Password must not include the special characters "=", ".", or "\". The values in this file must not include the special character '"'.
- Update the baw_db_server.property file with the same database details as in the on-premises IBM Case Manager.
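For illustration, a baw_db_server.property fragment for a single Db2 server alias might look like the following sketch. The alias dbserver1, the host name, and the exact key layout are hypothetical; use the keys exactly as the script generated them and only change the values:

```properties
# Hypothetical fragment of baw_db_server.property; values must stay in double quotes.
<DB_SERVER_LIST>="dbserver1"
dbserver1.DATABASE_SERVERNAME="icmdb.example.com"
dbserver1.DATABASE_PORT="50000"
dbserver1.DATABASE_SSL_ENABLE="false"
```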
- Update the baw_db_name_user.property file with the user names and passwords of the GCD database, DOCS database, DOS database, TOS database, ICN database, and BAW database.
- Update the baw_LDAP.property file with the same LDAP details as in the on-premises IBM Case Manager.
- Update the baw_user_profile.property file with the license, admin user (as in the on-premises IBM Case Manager), and keystore passwords.
- After you update all the property files, generate the SQL statement scripts for the databases by running the baw-prerequisites.sh file in generate mode:
./baw-prerequisites.sh -m generate
- View the results.
============== Generating DB SQL Statement file required by BAW on containers based on property file ==============
Creating the DB SQL statement file for Content Platform Engine global configuration database (GCD)
[✔] Created the DB SQL statement file for Content Platform Engine global configuration database (GCD)
Creating the DB SQL statement file for IBM Business Automation Navigator database
[✔] Created the DB SQL statement file for IBM Business Automation Navigator database
Creating the DB SQL statement file for Business Automation Workflow: BAWDOCS
[✔] Created the DB SQL statement file for Business Automation Workflow: BAWDOCS
Creating the DB SQL statement file for Business Automation Workflow: BAWDOS
[✔] Created the DB SQL statement file for Business Automation Workflow: BAWDOS
Creating the DB SQL statement file for Business Automation Workflow: BAWTOS
[✔] Created the DB SQL statement file for Business Automation Workflow: BAWTOS
Creating the DB SQL statement file for Application Engine Data Persistent
[✔] Created the DB SQL statement file for Application Engine Data Persistent
Creating the DB SQL statement file for Business Automation Workflow database instance1
[✔] Created the DB SQL statement file for Business Automation Workflow database instance1
Creating the DB SQL statement file for User Management Services
[✔] Created the DB SQL statement file for User Management Services
Creating the DB SQL statement file for Application Engine database
[✔] Created the DB SQL statement file for Application Engine database
[NEXT ACTIONS]
* The DB SQL statement files were created under directory /root/cert-kubernetes/scripts/baw-prerequisites/dbscript, you could modify or use default setting to create database.
(NOTES: PLEASE DO NOT CHANGE DBNAME/DBUSER/DBPASSWORD DIRECTLY in DB SQL statement files. PLEASE CHANGE THEM IN PROPERTY FILES IF NEEDED)
============== Generating YAML template for secret required by BAW on containers deployment based on property file ==============
Creating ldap-bind-secret secret YAML template
[✔] Created ldap-bind-secret secret YAML template
Creating ibm-fncm-secret secret YAML template
[✔] Created ibm-fncm-secret secret YAML template
Creating ibm-ban-secret secret YAML template
[✔] Created ibm-ban-secret secret YAML template
Creating Application Engine secret YAML template
[✔] Created Application Engine secret YAML template
Creating Business Automation Workflow secret YAML template
[✔] Created Business Automation Workflow secret YAML template
Creating UMS secret YAML template
[✔] Created UMS secret YAML template
[NEXT ACTIONS]
* Enter the <Required> values in the YAML templates for the secrets under /root/cert-kubernetes/scripts/baw-prerequisites/secret_template
* You can use this shell script to create the secret automatically: /root/cert-kubernetes/scripts/baw-prerequisites/create_secret.sh
* Create the databases and Kubernetes secrets manually based on your modified "DB SQL statement file" and "YAML template for secret".
* And then run "baw-prerequisites.sh -m validate" command to verify that the databases and secrets are created correctly
- Run the generated scripts for your specific database type. The scripts are created under /root/cert-kubernetes/scripts/baw-prerequisites/dbscript/baw. Copy the scripts to your database server and run them. (The CPE databases are already available.)
Note: Don't change DBNAME, DBUSER, or DBPASSWORD directly in the files. Change them in the property files if needed.
- To create the required secrets, run the create_secret.sh script in /root/cert-kubernetes/scripts/baw-prerequisites:
./create_secret.sh
- View the results.
======================================================================
******************************* START **************************************
[INFO] Applying YAML template file: /root/cert-kubernetes/scripts/baw-prerequisites/secret_template/ibm-ldap-bind-secret.yaml
secret/ldap-bind-secret configured
******************************** END ***************************************
******************************* START **************************************
[INFO] Applying YAML template file: /root/cert-kubernetes/scripts/baw-prerequisites/secret_template/fncm/ibm-fncm-secret.yaml
secret/ibm-fncm-secret configured
******************************** END ***************************************
******************************* START **************************************
[INFO] Applying YAML template file: /root/cert-kubernetes/scripts/baw-prerequisites/secret_template/ban/ibm-ban-secret.yaml
secret/ibm-ban-secret configured
******************************** END ***************************************
******************************* START **************************************
[INFO] Applying YAML template file: /root/cert-kubernetes/scripts/baw-prerequisites/secret_template/ae/ibm-aae-app-engine-secret.yaml
secret/icp4adeploy-workspace-aae-app-engine-admin-secret configured
******************************** END ***************************************
******************************* START **************************************
[INFO] Applying YAML template file: /root/cert-kubernetes/scripts/baw-prerequisites/secret_template/baw-std/ibm-baw-db-secret.yaml
secret/ibm-baw-wfs-server-db-secret configured
******************************** END ***************************************
******************************* START **************************************
[INFO] Applying YAML template file: /root/cert-kubernetes/scripts/baw-prerequisites/secret_template/ums/ibm-ums-db-secret.yaml
secret/ibm-dba-ums-secret configured
******************************** END ***************************************
- To verify the secrets and the database connections before you generate the CR, run the validate command:
./baw-prerequisites.sh -m validate
- View the results.
./baw-prerequisites.sh -m validate
*****************************************************
Validating the prerequisites before you install BAW
*****************************************************
============== Checking license required by CP4BA ==============
[✔] The license for the CP4A deployment: production
[✔] The license for IBM Business Automation Workflow (BAW): production
[✔] The license for FileNet Content Manager (FNCM): production
============== Checking the Kubernetes secret required by IBM Business Automation Workflow on containers existing in cluster or not ==============
[✔] Found secret "ldap-bind-secret" in Kubernetes cluster, PASSED!
[✔] Found secret "ibm-fncm-secret" in Kubernetes cluster, PASSED!
[✔] Found secret "ibm-ban-secret" in Kubernetes cluster, PASSED!
[✔] Found secret "icp4adeploy-workspace-aae-app-engine-admin-secret" in Kubernetes cluster, PASSED!
[✔] Found secret "ibm-baw-wfs-server-db-secret" in Kubernetes cluster, PASSED!
[✔] Found secret "ibm-dba-ums-secret" in Kubernetes cluster, PASSED!
============== All secrets created in Kubernetes cluster, PASSED! ==============
============== Checking LDAP connection required by CP4BA ==============
Checking connection for LDAP server "newldap1.fyre.ibm.com" using Bind DN "cn=root"..
Binding...
Binding with principal: cn=root
Connected to: ldap://newldap1.fyre.ibm.com:389
Binding took 84 ms
Total time taken: 84 ms
[✔] Connected to LDAP "newldap1.fyre.ibm.com" using BindDN:"cn=root" successfuly, PASSED!
============== Checking DB connection ==============
Checking connection for db2 database "GCD" belongs to database server "icm533ps1" which defined in <DB_SERVER_LIST>....
Connected to the database
Success!
[✔] Checked DB connection for "GCD" on database host server "icm533ps1.fyre.ibm.com", PASSED!
Checking connection for db2 database "DOCS" belongs to database server "icm533ps1" which defined in <DB_SERVER_LIST>....
Connected to the database
Success!
[✔] Checked DB connection for "DOCS" on database host server "icm533ps1.fyre.ibm.com", PASSED!
Checking connection for db2 database "DOS1" belongs to database server "icm533ps1" which defined in <DB_SERVER_LIST>....
Connected to the database
Success!
[✔] Checked DB connection for "DOS1" on database host server "icm533ps1.fyre.ibm.com", PASSED!
Checking connection for db2 database "TOS1" belongs to database server "icm533ps1" which defined in <DB_SERVER_LIST>....
Connected to the database
Success!
[✔] Checked DB connection for "TOS1" on database host server "icm533ps1.fyre.ibm.com", PASSED!
Checking connection for db2 database "AEOS" belongs to database server "icm533ps1" which defined in <DB_SERVER_LIST>....
Connected to the database
Success!
[✔] Checked DB connection for "AEOS" on database host server "icm533ps1.fyre.ibm.com", PASSED!
Checking connection for db2 database "ECMDB" belongs to database server "icm533ps1" which defined in <DB_SERVER_LIST>....
Connected to the database
Success!
[✔] Checked DB connection for "ECMDB" on database host server "icm533ps1.fyre.ibm.com", PASSED!
Checking connection for db2 database "AAEDB" belongs to database server "icm533ps1" which defined in <DB_SERVER_LIST>....
Connected to the database
Success!
[✔] Checked DB connection for "AAEDB" on database host server "icm533ps1.fyre.ibm.com", PASSED!
Checking connection for db2 database "BAWDB" belongs to database server "icm533ps1" which defined in <DB_SERVER_LIST>....
Connected to the database
Success!
[✔] Checked DB connection for "BAWDB" on database host server "icm533ps1.fyre.ibm.com", PASSED!
Checking connection for db2 database "OAUTH2DB" belongs to database server "icm533ps1" which defined in <DB_SERVER_LIST>....
Connected to the database
Success!
[✔] Checked DB connection for "OAUTH2DB" on database host server "icm533ps1.fyre.ibm.com", PASSED!
[INFO] If all prerequisites check PASSED, you can run cp4a-deployment to deploy CP4BA. Otherwise, please check configuration again.
- From the /scripts folder, generate the CR by running the following command:
./baw-prerequisites.sh -m generate-cr
- Review the IBM Business Automation Workflow license information. Enter Yes to accept it.
- Select the platform to deploy on.
- Select the profile size. For more information about profile sizes, see "Planning for IBM Business Automation Workflow on containers."
- Enter the deployment hostname suffix.
- View the results.
========================================================================
IMPORTANT: Review the IBM Business Automation Workflow license information here:
http://www14.software.ibm.com/cgi-bin/weblap/lap.pl?li_formnum=L-XDJC-8EJ6CN
Press any key to continue
Do you accept the IBM Business Automation Workflow license (Yes/No, default: No): Yes
Select the cloud platform to deploy:
1) Openshift Container Platform (OCP) - Private Cloud
2) Other ( Certified Kubernetes Cloud Platform / CNCF)
Enter a valid option [1 to 2]: 1
[✔] Selected platform: OCP
Please select the deployment profile (default: small). Refer to the documentation in IBM Business Automation Workflow on containers Knowledge Center for details on profile.
1) small
2) medium
3) large
Enter a valid option [1 to 3]: 1
[✔] Selected profile: small
Please enter the deployment hostname suffix: dev.apps.bawfvt23.cp.fyre.ibm.com
[✔] Collected the deployment hostname suffix: dev.apps.bawfvt23.cp.fyre.ibm.com
Applying value in property file into final CR
[✔] Applied value in property file into final CR under /root/cert-kubernetes/scripts/baw-prerequisites/generated-cr
[NEXT ACTIONS]
Please confirm final custom resource under /root/cert-kubernetes/scripts/baw-prerequisites/generated-cr
After done, press any key to next!
Press any key to continue
When you press a key, the CR file is generated under /root/cert-kubernetes/scripts/baw-prerequisites/generated-cr.
Modify the following fields in the CR based on the on-premises data.
- The Content Initialization flag must be set to false.
shared_configuration:
  sc_content_initialization: false
- The ICN Datasource name must be the same as in the on-premises environment.
datasource_configuration:
  dc_icn_datasource:
    dc_common_icn_datasource_name: "ECMClientDS"
- The GCD Datasource name must be the same as in the on-premises environment.
datasource_configuration:
  dc_gcd_datasource:
    dc_common_gcd_datasource_name: "FNGCDDS"
    dc_common_gcd_xa_datasource_name: "FNGCDDSXA"
- The DOCS Object Store datasource must be the same as in the on-premises environment.
dc_os_datasources:
  - dc_os_label: "docs"
    dc_common_os_datasource_name: "FNDOCSDS"
    dc_common_os_xa_datasource_name: "FNDOCSDSXA"
- The DOS Object Store datasource must be the same as in the on-premises environment.
dc_os_datasources:
  - dc_os_label: "dos"
    dc_common_os_datasource_name: "FNDOSDS"
    dc_common_os_xa_datasource_name: "FNDOSDSXA"
- The TOS Object Store datasource must be the same as in the on-premises environment.
dc_os_datasources:
  - dc_os_label: "tos"
    dc_common_os_datasource_name: "FNTOSDS"
    dc_common_os_xa_datasource_name: "FNTOSDSXA"
- The Case History Datasource and Connection name must be the same as in the on-premises environment.
dc_cpe_datasources:
  - dc_database_type: "db2"
    dc_os_label: "ch"
    dc_common_cpe_xa_datasource_name: "FNCHDSXA"
    dc_common_cpe_datasource_name: "FNCHDS"
    database_servername: "icmsuser1.fyre.ibm.com"
    database_port: "50000"
    dc_common_conn_name: "CHDB_connection"
- The navigator configuration section must be filled in with the on-premises schema and tablespace names.
navigator_configuration:
  icn_production_setting:
    timezone: Etc/UTC
    icn_db_type: oracle
    icn_jndids_name: ECMClientDS1
    icn_schema: ECMDB # As on-prem
    icn_table_space: ECMDB # As on-prem
    icn-admin: P8Admin
    license: accept
- The content integration section must be filled in with the same object store names as in the on-premises environment.
content_integration:
  ## Domain name for content integration. The value must be the same as initialize_configuration.ic_domain_creation.domain_name.
  domain_name: "P8Domain"
  ## Object Store name for content integration.
  ## The value must be an existing object store in Content Platform Engine.
  ## If you use initialize_configuration for the object store initialization, the value must be one of initialize_configuration.ic_obj_store_creation.object_stores.
  object_store_name: "DOCS"
- In 22.0.2 or 23.0.1, the case integration section must be filled in with the same object store names as in the on-premises environment.
case:
  ## Domain name for CASE. The value must be the same as initialize_configuration.ic_domain_creation.domain_name.
  domain_name: "P8Domain"
  ## Design Object Store name of CASE.
  ## The value must be the same as the oc_cpe_obj_store_symb_name value of one of the object stores defined in initialize_configuration.ic_obj_store_creation.object_stores.
  object_store_name_dos: "DOS"
  ## Target Object Store name of CASE.
  ## The value must be the same as the oc_cpe_obj_store_symb_name value of one of the object stores defined in initialize_configuration.ic_obj_store_creation.object_stores.
  object_store_name_tos: "TOS"
  connection_point_name_tos: "TOSConntPnt"
- In 23.0.2, multiple target object stores are supported. The case integration section must be filled in with the same object store names as in the on-premises environment.
case:
  ## Domain name for CASE. The value must be the same as initialize_configuration.ic_domain_creation.domain_name.
  domain_name: "P8Domain"
  ## Design Object Store name of CASE.
  ## The value must be the same as the oc_cpe_obj_store_symb_name value of one of the object stores defined in initialize_configuration.ic_obj_store_creation.object_stores.
  object_store_name_dos: "DOS"
  ## Target Object Store name of CASE.
  ## The value must be the same as the oc_cpe_obj_store_symb_name value of one of the object stores defined in initialize_configuration.ic_obj_store_creation.object_stores.
  tos_list:
    - object_store_name_tos: "TOS"
      connection_point_name: "TOSConntPnt"
      desktop_id: "baw"
      target_environment_name: "target_env"
      is_default: true
If there are multiple target object stores, append the target object store details for each of them, as shown in the following example.
Note: Make sure that the CR has a corresponding datasource section for each target object store, and each of them also has a corresponding secret in the ibm-fncm-secret file.
For three target object stores:
case:
  ## Domain name for CASE. The value must be the same as initialize_configuration.ic_domain_creation.domain_name.
  domain_name: "P8Domain"
  ## Design Object Store name of CASE.
  ## The value must be the same as the oc_cpe_obj_store_symb_name value of one of the object stores defined in initialize_configuration.ic_obj_store_creation.object_stores.
  tos_list:
    - object_store_name: "BAWINS1TOS"
      connection_point_name: "TOS_Connt_Pnt"
      desktop_id: "baw"
      target_environment_name: "target_env"
      is_default: "true"
    - object_store_name: "BAWINS1TOS2"
      connection_point_name: "TOS2_Connt_Pnt"
      desktop_id: "bawfvt"
      target_environment_name: "target_env2"
      is_default: "false"
    - object_store_name: "BAWINS1TOS3"
      connection_point_name: "TOS3_Connt_Pnt"
      desktop_id: "bawtest"
      target_environment_name: "target_env3"
      is_default: "false"
- Remove the initialize_configuration section from the CR. This section is not needed because you are reusing the existing FileNet domain, object stores, and LDAP.
- Update the TOS name in the CR.
federation_config:
  workflow_server:
    case_manager:
      - object_store_name: TOS
- In the ibm-fncm-secret secret, add the credentials for the Case History (ch) database.
You must validate your CR file before you apply it or save it in the YAML view. It is likely that you edited the file multiple times, and possibly introduced errors or missed values during your customizations. For more information, see "Validating the YAML in your custom resource file."
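As a quick sanity check before applying, you can at least confirm that the edited file is still syntactically valid YAML. A minimal sketch, assuming python3 with PyYAML is available on your workstation; the sample file written here is only a stand-in, in practice you point CR_FILE at your generated CR under /root/cert-kubernetes/scripts/baw-prerequisites/generated-cr:

```shell
# Write a stand-in fragment; replace with the path to your generated CR.
CR_FILE=$(mktemp)
cat > "$CR_FILE" <<'EOF'
shared_configuration:
  sc_content_initialization: false
EOF

# Fails loudly on tab characters, broken indentation, and unclosed quotes.
python3 -c "import sys, yaml; yaml.safe_load(open(sys.argv[1])); print('YAML OK')" "$CR_FILE"
```

A client-side dry run, `oc apply --dry-run=client -f <file>`, additionally checks that the document is a well-formed Kubernetes object without touching the cluster.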
To deploy the CR, apply the updated CR to the operator. For example, using the OpenShift CLI:
oc apply -f ibm_cp4a_cr_production_FC_workflow-standalone.yaml
To verify that the installation was successful, follow the steps in "Verifying the installation of IBM Business Automation Workflow on containers."
You must perform post-installation steps if you work with file storage Target Object Stores, or if you are moving to 22.0.2 or 23.0.1 and have multiple Target Object Stores.
If you use a file storage Target Object Store: Log in to ACCE. After you log in, update the path on the Properties tab of the TOSStorage file storage device to point to the location in the Content Platform Engine physical volume. Then copy the file storage backup to the Content Platform Engine physical volume.
For 22.0.2 or 23.0.1, if you use more than one Target Object Store: Perform the following mandatory steps.
If there are any file storage target areas, update the path on the TOSStorage Properties tab and then run the case init job. In a multiple Target Object Store environment, the case init job must be run for all the target areas, as described in the following steps:
- Export the case initJob.json file:
  - Save the name of the case-init job into a variable:

    ```shell
    caseinitjobname=$(oc get job | grep case-init | awk '{print $1}')
    ```

  - Export the case-init job JSON file:

    ```shell
    oc get job ${caseinitjobname} -o json > initJob.json
    ```
- Edit the initJob.json file. Remove any lines that contain a controller-uid string.
  The controller-uid is used internally by Kubernetes to track and manage the controller state and to handle reconciliation processes efficiently. If you do not remove the controller-uid from the exported JSON file, you cannot run the case-init job successfully with your changes.
- Update or add the following environment variables in the env: array (container.env: array):

  ```json
  { "name": "TOS_OBJECTSTORE_NAME", "value": "<TOS_name>" },
  { "name": "CONN_POINT_NAME", "value": "<connection_point_name>" },
  { "name": "TARGET_ENV_NAME", "value": "<target_environment_name>" },
  ```

- If the environment is a Workflow Authoring environment, set the ADD_OS_ADMINS_IN_DEFAULT_PA value to false.
- Save your updates.
- In 22.0.2, delete the config.ok file from the /nfsdata/icn/pluginstore/properties directory. In 23.x, delete the config.ok file from the /opt/ibm/plugins/properties directory on the Business Automation Navigator (BAN) pod.
- Run the following command:

  ```shell
  oc replace --force -f initJob.json
  ```

  Case init automatically configures the project area or target environment for the additional target object stores to the default case client desktop (desktop=baw). To change the desktop for the project area or target environment, run Register Project Area or Register Target Environment in the case administration client for each additional target object store.
- To configure a new Business Automation Navigator desktop for an additional project area:
- Log in to the Business Automation Navigator admin desktop.
- Copy the out-of-the-box desktop Business Automation Workflow.
- Edit the new desktop. On the General tab, specify the Name and ID.
- On the Connections tab, add the new target object store and remove the out-of-the-box target object store.
- Log in to the bawadmin desktop.
- Navigate to the design object store.
- Right-click the connection definition for the new project area and click Register.
- In the next dialog, enter the configuration parameters for the new project area.
- Name of the associated Business Automation Navigator desktop
- Case operations username
  - Password
- Click Next. In the next dialog, enter the configuration parameters for the Business Automation Workflow server and context roots.
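The controller-uid cleanup in the steps above can be scripted. The following is a minimal sketch, shown on a stand-in file; the real initJob.json is the one produced by the `oc get job ${caseinitjobname} -o json` export:

```shell
# Stand-in for the exported job JSON (the real file comes from
# "oc get job ${caseinitjobname} -o json > initJob.json").
cat > initJob.json <<'EOF'
{
  "metadata": {
    "labels": {
      "controller-uid": "d1f2e3a4-0000-0000-0000-000000000000"
    }
  }
}
EOF
# Remove every line that mentions controller-uid, keeping a .bak copy.
# This works because the exported JSON is pretty-printed, one field per line.
sed -i.bak '/controller-uid/d' initJob.json
# The cleaned file is now ready for "oc replace --force -f initJob.json".
! grep -q 'controller-uid' initJob.json && echo "controller-uid removed"
```

After the cleanup, remember to also apply the environment-variable updates described above before running `oc replace`.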
Optional post-migration steps
After migration, you can work with custom plug-ins and extensions, enable CA/CH stores, index case instances, enable case analyzer and case history, and enable case event emitters.

Work with custom plug-ins and custom extensions
With custom plug-ins and custom extensions in the production environment, you must copy the plug-ins to /<icn pv directory>/<icn-pluginstore>. Add lines similar to the following example lines in the CR file under the case configuration, and deploy the CR file.

```yaml
custom_package_names: "ICMCustomWidgets.zip"
custom_extension_names: "CustomEditors.zip"
```

For more details about custom packages and custom extensions, see "Configuring custom case widgets for a container environment".
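Staging the packages onto the Navigator persistent volume can be sketched as follows. The mount point and the zip files here are assumptions for illustration; substitute your actual /<icn pv directory>/<icn-pluginstore> path and your real package files:

```shell
# ICN_PLUGINSTORE is a placeholder mount point (assumption) - use your actual
# /<icn pv directory>/<icn-pluginstore> path on the persistent volume.
ICN_PLUGINSTORE=${ICN_PLUGINSTORE:-/tmp/icn-pluginstore}
mkdir -p "$ICN_PLUGINSTORE"
# Stand-ins for the real custom packages named in the CR.
touch ICMCustomWidgets.zip CustomEditors.zip
cp ICMCustomWidgets.zip CustomEditors.zip "$ICN_PLUGINSTORE"/
ls "$ICN_PLUGINSTORE"
```

The file names copied here must match the custom_package_names and custom_extension_names values in the CR exactly, or the operator cannot pick them up.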
Enable the case analyzer and case history stores in the ACCE

- In the Administrative Console for Content Platform Engine, go to FileNet P8 domain > Workflow Subsystem.
- Select Case History enabled and Case Analyzer enabled.
- Restart the Content Platform Engine pod to reflect the changes.
Configure the emitter
For the Case event emitter configuration parameters, see "IBM Business Automation Workflow Runtime and Workstream Services parameters."
Index case instances

The case management tools provide support for indexing case instances in the Elasticsearch index. Full re-indexing and live index updates are supported. For more information, see Indexing case instances.
Additional Information
Trademarks and service marks
Product Synonym
IBM BAW, Business Automation Workflow
Document Information
Modified date:
19 March 2024
UID
ibm17060468