Preparing for migration in InfoSphere Information Server (Linux, AIX)
Before you export any data from InfoSphere Information Server to Cloud Pak for Data, complete a set of setup tasks.
Before you begin
To complete these tasks, you must be logged in to the InfoSphere Information Server node as root.
Also, make sure that the prerequisites listed in Prerequisites to migrating data are met.
Tasks to complete on the InfoSphere Information Server system before migrating
- Setting environment variables
- Installing required tools
- Increasing the expiry time for the CSRF token
- Removing invalid users
- Assigning the Suite User role to users with inherited roles
- Granting access to all data quality projects
- Improving export performance
- Installing the migration toolkit
- Running the initialization script
- Checking data integrity
- Increasing the timeout value for the LTPA token
- Installing required software packages
- Creating a db2dsdriver.cfg configuration file
- Determining the scope of export
Setting environment variables
Complete the following steps:
- Log in to the InfoSphere Information Server node as root.
- Open a bash shell:
  bash
- Set the following environment variables:
  IIS_INSTALL_PATH=<IIS installation path>
  IIS_HOST=<IIS host>
  IIS_PORT=<IIS port>
  IIS_USERNAME=<IIS username>
  IIS_PASSWORD=<IIS password>
  TOOLKIT_PATH=<directory for storing the toolkit content; the directory must not be under the /root path>
  IIS_INSTALL_PATH example: If InfoSphere Information Server is installed in the default location, set the IIS_INSTALL_PATH variable to the value /opt/IBM/InformationServer.
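For reference, here is a minimal sketch of what these assignments might look like in a bash session. Apart from the default installation path mentioned above, every value is a made-up placeholder; replace all of them with the details of your own environment.
# Example values only; substitute your own host, port, credentials, and toolkit directory.
IIS_INSTALL_PATH=/opt/IBM/InformationServer
IIS_HOST=iis-services.example.com
IIS_PORT=9443
IIS_USERNAME=isadmin
IIS_PASSWORD='<your password>'
TOOLKIT_PATH=/opt/iis-migration-toolkit   # must not be under /root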
Installing required tools
Download and install the required tools for your operating system as root user.
Complete the steps that apply to your operating system.
- Red Hat® Enterprise Linux
  Install the jq utility.
  - Change to the ${TOOLKIT_PATH} directory:
    cd ${TOOLKIT_PATH}
  - To install the utility, run the following commands:
    curl -LO https://github.com/jqlang/jq/releases/download/jq-1.7.1/jq-linux-i386
    chmod +x ./jq-linux-i386
    cp jq-linux-i386 jq
- AIX
  Install the wget, curl, jq, and dos2unix utilities. Then, run the dos2unix tool to convert the database.properties file into the required format.
  - Change to the ${TOOLKIT_PATH} directory:
    cd ${TOOLKIT_PATH}
  - To install the utilities, run the following commands:
    dnf install wget -y
    dnf install curl -y
    dnf install jq -y
    dnf install dos2unix -y
    dos2unix ${IIS_INSTALL_PATH}/ASBServer/conf/database.properties
- SUSE Linux and SUSE Linux on System z
  Install the jq utility.
  - Change to the ${TOOLKIT_PATH} directory:
    cd ${TOOLKIT_PATH}
  - To install the utility, run the following command:
    zypper install jq
- Red Hat Linux on System z
  Install the jq utility.
  - Change to the ${TOOLKIT_PATH} directory:
    cd ${TOOLKIT_PATH}
  - To install the utility, run the following commands:
    curl -LO https://github.com/jqlang/jq/releases/download/jq-1.7/jq-linux-s390x
    chmod +x ./jq-linux-s390x
    cp jq-linux-s390x jq
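As an optional check that is not part of the documented procedure, you can confirm that jq is usable before you continue. The first command applies when you downloaded the binary into ${TOOLKIT_PATH}; the second applies when jq was installed with dnf or zypper.
${TOOLKIT_PATH}/jq --version
jq --version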
Increasing the expiry time for the CSRF token
Increase the expiry time for the CSRF token. Run the commands that apply to your environment.
- Set the expiry time to 600 seconds by running the following command:
  ${IIS_INSTALL_PATH}/ASBServer/bin/iisAdmin.sh -set -key com.ibm.iis.isf.security.CsrfTokenExpiryTime -value 600
- Confirm the setting by running the following command:
  ${IIS_INSTALL_PATH}/ASBServer/bin/iisAdmin.sh -d | grep com.ibm.iis.isf.security.CsrfTokenExpiryTime
Removing invalid users
Remove all invalid users from the user registry. Run the commands that apply to your environment.
- Get the list of invalid users:
  ${IIS_INSTALL_PATH}/ASBServer/bin/DirectorySync.sh -url https://${IIS_HOST}:${IIS_PORT} -user ${IIS_USERNAME} -password ${IIS_PASSWORD} -giu
- Delete all users that were returned in the previous step. Pass the usernames as a tilde-delimited list to the DirectorySync.sh script. If the entries to be deleted are long full DN names, enclose each username in double quotation marks ("):
  ${IIS_INSTALL_PATH}/ASBServer/bin/DirectorySync.sh -user ${IIS_USERNAME} -password ${IIS_PASSWORD} -url https://${IIS_HOST}:${IIS_PORT} -delete_user_ids user1~user2~…userN
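To illustrate the quoting requirement, here is a hypothetical invocation that deletes two users identified by full DNs. Both DNs are placeholders; the outer single quotes keep the inner double quotation marks intact when the shell passes the list to the script.
# Hypothetical example only: replace the DNs with the entries returned by the -giu command.
${IIS_INSTALL_PATH}/ASBServer/bin/DirectorySync.sh -user ${IIS_USERNAME} -password ${IIS_PASSWORD} -url https://${IIS_HOST}:${IIS_PORT} -delete_user_ids '"cn=Jane Doe,ou=users,dc=example,dc=com"~"cn=John Roe,ou=users,dc=example,dc=com"'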
Assigning the Suite User role to users with inherited roles
Assign the Suite User role to all users that do not have any security roles assigned directly but inherit the roles from the groups that they are part of.
Run the commands that apply to your environment.
- Get the list of users without direct role assignments:
  ${IIS_INSTALL_PATH}/ASBServer/bin/UsersSync.sh -url https://${IIS_HOST}:${IIS_PORT} -user ${IIS_USERNAME} -password ${IIS_PASSWORD} -list USERS
- Assign the Suite User role to the users returned in the previous step:
  ${IIS_INSTALL_PATH}/ASBServer/bin/UsersSync.sh -url https://${IIS_HOST}:${IIS_PORT} -user ${IIS_USERNAME} -password ${IIS_PASSWORD} -list USERS -sync
Granting access to all data quality projects
Grant access to all data quality projects. Run the commands that apply to your environment.
${IIS_INSTALL_PATH}/ASBServer/bin/iisAdmin.sh -set -key com.ibm.iis.ia.server.accessAllProjects -value true
${IIS_INSTALL_PATH}/ASBServer/bin/iisAdmin.sh -set -key com.ibm.iis.ismigration -value true
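Optionally, you can confirm that both properties are now set by reusing the same iisAdmin.sh display pattern that the CSRF step above uses. This check is a suggestion, not part of the documented procedure.
${IIS_INSTALL_PATH}/ASBServer/bin/iisAdmin.sh -d | grep com.ibm.iis.ia.server.accessAllProjects
${IIS_INSTALL_PATH}/ASBServer/bin/iisAdmin.sh -d | grep com.ibm.iis.ismigration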
Improving export performance
To improve the performance of the migration during the export, complete these steps as root.
- Create additional indexes in the metadata repository. Complete the following steps depending on where your metadata repository is hosted. These steps must be completed on the InfoSphere Information Server services tier:
  - Metadata repository on Db2
    Run these xmetaAdmin commands:
    cd ${IIS_INSTALL_PATH}/ASBServer/bin
    ./xmetaAdmin.sh addIndex -model ASCLModel -class DataFileFolder importedVia_DataConnection ASC -dbfile ../conf/database.properties
    ./xmetaAdmin.sh addIndex -model ASCLModel -class DataConnection accesses_DataStore ASC -dbfile ../conf/database.properties
    ./xmetaAdmin.sh addIndex -model DataStageX -class DSDataConnection accesses_DataStore ASC -dbfile ../conf/database.properties
    ./xmetaAdmin.sh addIndex -model ASCLModel -class DataCollection of_PhysicalModel ASC -dbfile ../conf/database.properties
    ./xmetaAdmin.sh addIndex -model ASCLLogicalModel -class Relationship of_LogicalModel ASC -dbfile ../conf/database.properties
    ./xmetaAdmin.sh addIndex -model ASCLModel -class HostSystem name ASC -dbfile ../conf/database.properties
    ./xmetaAdmin.sh addIndex -model ASCLModel -class Connector hostedBy_HostSystem ASC -dbfile ../conf/database.properties
    ./xmetaAdmin.sh addIndex -model ASCLModel -class Connector connectionType ASC -dbfile ../conf/database.properties
    ./xmetaAdmin.sh addIndex -model ASCLModel -class DataConnection usedBy_Connector ASC -dbfile ../conf/database.properties
    ./xmetaAdmin.sh addIndex -model DataStageX -class DSDataConnection usedBy_Connector ASC -dbfile ../conf/database.properties
  - Metadata repository on Oracle
    Run the following commands:
    CREATE INDEX IDX2102100719410 ON investigateDtQltyDmnsn (OFDATAQUALITYCONFIGURATIONXMET ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC, HAS_BENCHMARK_XMETA ASC, IGNORED_XMETA ASC, WEIGHT_XMETA ASC);
    CREATE INDEX IDX2102100926320 ON issMstrDtFldrfFrmDtFld (DATAFIELD_XMETA ASC);
    CREATE INDEX IDX2102100927200 ON investigateDatQltyRslt (FROM_EXECUTIONHISTORY_XMETA ASC, NBRECORDSTESTED_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC);
    CREATE INDEX IDX2102100927180 ON ASCLAnalysisQultyPrblm (FROM_DATAQUALITYRESULT_XMETA ASC, NBOCCURRENCES_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC);
    CREATE INDEX IDX2102100927560 ON investigateExectnHstry (OF_QUALITYCOMPONENT_XMETA ASC, STATUS_XMETA ASC, STARTTIME_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC, HAS_EXECUTIONRESULT_XMETA ASC);
    CREATE INDEX IDX2102100928040 ON investigatQltyPrblmTyp (CODE_XMETA ASC, DESCRIPTION_XMETA ASC, NAME_XMETA ASC);
    CREATE INDEX IDX2102100925570 ON investigateRuleCompnnt (OF_ANALYSISPROJECT_XMETA ASC, SHORTDESCRIPTION_XMETA ASC, NAME_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC);
    CREATE INDEX IDX2102100734290 ON ASCLAnalysisClassifctn (METHOD_XMETA ASC, STATE_XMETA DESC);
    CREATE INDEX IDX2102100734240 ON ASCLAnalysisClassifctn (STATE_XMETA ASC, METHOD_XMETA DESC);
    CREATE INDEX IDX2102100734490 ON investigtClmnnlyssMstr (PROJECTRID_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC, TABLEANALYSISMASTER_XMETA ASC);
    CREATE INDEX IDX2102100735000 ON ArfFrmrgntdFrmClssfctn (ORIGINATEDFROMCLASSIFICATINXMT ASC);
    CREATE INDEX IDX2102100735420 ON ASCLAnalysis_DataClass (NAME_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC);
    CREATE INDEX IDX2102100736020 ON ASCLAnalysisClassifctn (STATE_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC, VALUEFREQUENCY_XMETA ASC);
    CREATE INDEX IDX2102100735470 ON investigtClmnnlyssMstr (PROJECTRID_XMETA ASC, TABLEANALYSISMASTER_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC);
    CREATE INDEX IDX2102100738180 ON investigateExectnHstry ("OF_QUALITYCOMPONENT_XMETA" ASC, "STARTTIME_XMETA" DESC);
    CREATE INDEX IDX2102100736090 ON investigateRuleCompnnt ("OF_ANALYSISPROJECT_XMETA" ASC, "NAME_XMETA" ASC, "XMETA_REPOS_OBJECT_ID_XMETA" ASC);
    CREATE INDEX IDX2102100741130 ON investigateDatQltyRslt ("FROM_EXECUTIONHISTORY_XMETA" ASC, "XMETA_REPOS_OBJECT_ID_XMETA" ASC);
    CREATE INDEX IDX2102100744410 ON investgtClmnDtTypSmmry (COLUMNANALYSISRESULTS_XMETA ASC, RECORDPERCENT_XMETA ASC, RECORDCOUNT_XMETA ASC, DATATYPE_XMETA ASC);
    CREATE INDEX "IDX2102100653330" ON investgtClmnnlyssRslts ("COLUMNANALYSISMASTER_XMETA" ASC, "RECORDCOUNT_XMETA" DESC);
    CREATE INDEX "IDX2102100653560" ON ASCLModel_DataFile ("HOSTEDBY_HOSTSYSTEM_XMETA" ASC, "PATH_XMETA" ASC, "NAME_XMETA" ASC, "XMETA_REPOS_OBJECT_ID_XMETA" ASC);
    CREATE INDEX IDX2102100659300 ON investigtClmnnlyssMstr (TABLEANALYSISMASTER_XMETA ASC, COLUMNPROPERTIES_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA ASC);
    CREATE INDEX "IDX2102100657460" ON investigateTblPKCnddts ("OF_TABLEANALYSISMASTER_XMETA" ASC, "XMETA_REPOS_OBJECT_ID_XMETA" ASC, "CANDIDATEFLAG_XMETA" ASC, "SELECTED_XMETA" ASC, "INFERRED_XMETA" ASC);
    CREATE INDEX "IDX2102100704290" ON investigatTblnlyssMstr ("ANALYSISMASTER_XMETA" ASC, "TABLEANALYSISSTATUS_XMETA" ASC, "XMETA_REPOS_OBJECT_ID_XMETA" ASC);
    CREATE INDEX "IDX2102100704280" ON iDtCllctnrfFrmDtCllctn ("DATACOLLECTION_XMETA" ASC);
    CREATE INDEX "IDX2102100706510" ON investigateTblPKCnddts ("OF_TABLEANALYSISMASTER_XMETA" ASC, "SELECTED_XMETA" ASC, "REJECTED_XMETA" ASC);
    CREATE INDEX "IDX2102100709420" ON ASCLModel_Annotation ("NOTELABEL_XMETA" ASC, "OF_COMMONOBJECT_XMETA" DESC);
    CREATE INDEX "IDX2102100708160" ON ASCLRules_RuleVariable ("FROM_RULE_XMETA" ASC, "XMETA_REPOS_OBJECT_ID_XMETA" ASC, "DEFAULT_RULEBINDING_XMETA" ASC);
    CREATE INDEX "IDX2102100708310" ON investigateRuleCompnnt ("OF_ANALYSISPROJECT_XMETA" ASC, "XMETA_REPOS_OBJECT_ID_XMETA" DESC);
    CREATE INDEX "IDX2102100713390" ON investigateAnalyssptns ("OF_ANALYSISSUITE_XMETA" ASC, "USEAUTOMATICDATAQLTYCNFGRTNXMT" DESC);
    CREATE INDEX "IDX2102100715420" ON investigateTblPKCnddts ("SELECTED_XMETA" ASC, "COLUMNANALYSISMASTER_XMETA" ASC, "XMETA_REPOS_OBJECT_ID_XMETA" ASC);
    CREATE INDEX "IDX2102100716030" ON investigateKeyComponnt ("OF_TABLEPKCANDIDATE_XMETA" ASC, "USESCOLUMNANALYSISMASTERXMETA" DESC);
    CREATE INDEX "IDX2102100717450" ON investigateAnalyssptns ("UNIQUENESSTHRESHOLD_XMETA" ASC, "OF_ANALYSISSUITE_XMETA" DESC);
    CREATE INDEX "IDX2102100717410" ON investigateAnalyssptns ("UNIQUENESSTHRESHOLD_XMETA" ASC, "OF_ANALYSISPROJECT_XMETA" DESC);
    CREATE INDEX "IDX2102100717390" ON investigateAnalyssptns ("UNIQUENESSTHRESHOLD_XMETA" ASC, "OF_TABLEANALYSISMASTER_XMETA" DESC);
    CREATE INDEX "IDX2102100716190" ON investigateAnalyssptns ("OF_COLUMNANALYSISMASTER_XMETA" ASC, "OF_TABLEANALYSISMASTER_XMETA" ASC, "OF_ANALYSISPROJECT_XMETA" ASC, "OF_ANALYSISMASTER_XMETA" ASC, "OF_ANALYSISSUITE_XMETA" ASC);
    CREATE INDEX "IDX2102100733120" ON ASCLRules_RuleBinding ("FROM_RULEEXECUTABLE_XMETA" ASC, "XMETA_REPOS_OBJECT_ID_XMETA" ASC, "BINDS_RULEVARIABLE_XMETA" ASC);
    CREATE INDEX "IDX2102100730040" ON investigatentgrDstrbtn ("VALUE_XMETA" ASC, "OFRULESETEXECUTIONRESULTXMETA" ASC, "ABSOLUTEFREQUENCY_XMETA" ASC, "FREQUENCY_XMETA" ASC);
    CREATE INDEX IDX2312060847540 ON investigatTblQltynlyss (OF_TABLEANALYSISMASTER_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA DESC);
    CREATE INDEX IDX2312060848270 ON investigateExectnHstry (OF_QUALITYCOMPONENT_XMETA ASC, ENDTIME_XMETA ASC, STARTTIME_XMETA ASC, HAS_EXECUTIONRESULT_XMETA ASC);
- Complete the following steps depending on where your metadata repository is hosted. These steps must be completed on the InfoSphere Information Server metadata repository tier.
  - Metadata repository on Db2
    - Set the following environment variable:
      DB2_INSTANCE_NAME=<db2-instance-name>
    - Change to the Db2 instance user, set the environment variables, and connect to the metadata repository:
      su ${DB2_INSTANCE_NAME}
      . ~/sqllib/db2profile
      DB2_INSTANCE_NAME=<db2-instance-name>
      XMETA_SCHEMA_NAME=<xmeta-schema-name>
      db2 connect to xmeta
    - Create the indexes:
      db2 "CREATE INDEX ${DB2_INSTANCE_NAME}.IDX2312060847540 ON ${XMETA_SCHEMA_NAME}.INVESTIGATE_TABLEQUALITYANALYSIS ( OF_TABLEANALYSISMASTER_XMETA ASC, XMETA_REPOS_OBJECT_ID_XMETA DESC) ALLOW REVERSE SCANS COLLECT SAMPLED DETAILED STATISTICS"
      db2 "CREATE INDEX ${DB2_INSTANCE_NAME}.IDX2312060848270 ON ${XMETA_SCHEMA_NAME}.INVESTIGATE_EXECUTIONHISTORY ( OF_QUALITYCOMPONENT_XMETA ASC, ENDTIME_XMETA ASC, STARTTIME_XMETA ASC, HAS_EXECUTIONRESULT_XMETA ASC) ALLOW REVERSE SCANS COLLECT SAMPLED DETAILED STATISTICS"
      db2 "CREATE UNIQUE INDEX ${DB2_INSTANCE_NAME}.IDX2312060848320 ON ${XMETA_SCHEMA_NAME}.INVESTIGATE_TABLEANALYSISMASTER ( XMETA_REPOS_OBJECT_ID_XMETA ASC ) INCLUDE ( TABLEANALYSISSTATUS_XMETA , ANALYSISMASTER_XMETA ) ALLOW REVERSE SCANS COLLECT SAMPLED DETAILED STATISTICS"
      db2 "CREATE INDEX ${DB2_INSTANCE_NAME}.IDX2312060848360 ON ${XMETA_SCHEMA_NAME}.INVESTIGATE_TABLEANALYSISMASTER_DATACOLLECTION_REFFROM_DATACOLLECTION (DATACOLLECTION_XMETA ASC) ALLOW REVERSE SCANS COLLECT SAMPLED DETAILED STATISTICS"
      db2 "CREATE UNIQUE INDEX ${DB2_INSTANCE_NAME}.IDX2312060848530 ON ${XMETA_SCHEMA_NAME}.INVESTIGATE_TABLEANALYSISSTATUS (XMETA_REPOS_OBJECT_ID_XMETA ASC) INCLUDE (DATAQUALITYANALYSISDATE_XMETA, DATAQUALITYANALYSISSTATUS_XMETA) ALLOW REVERSE SCANS COLLECT SAMPLED DETAILED STATISTICS"
      db2 "COMMIT"
    - Update the DBMS statistics for all the tables in the metadata repository before you start the migration:
      db2 -x "SELECT 'runstats on table',substr(rtrim(tabschema)||'.'||rtrim(tabname),1,50),' and indexes all;' FROM SYSCAT.TABLES WHERE (type = 'T') AND (tabschema = '${XMETA_SCHEMA_NAME}')" > /tmp/runstats_xmeta.out
      db2 -tvf /tmp/runstats_xmeta.out
    - Exit the Db2 instance owner account:
      exit
  - Metadata repository on Oracle
    With the assistance of a DBA, update the DBMS statistics for your metadata repository after the indexes are created.
    - Log in to the repository tier with root credentials.
    - Connect to SQL*Plus as the Oracle system user:
      sqlplus xmeta-schema-name/password@oracle_sid
      If this command returns Command not found, complete the following steps before you continue:
      - Locate the Oracle home directory and add it to the PATH variable:
        export ORACLE_HOME=<oracle home directory>
        export PATH=$ORACLE_HOME/bin:$PATH
      - Rerun the command for connecting to SQL*Plus.
    - Define a substitution variable XMETA_SCHEMA_NAME. Replace <xmeta-schema-name> with the schema name of the metadata repository in your environment:
      DEFINE XMETA_SCHEMA_NAME=<xmeta-schema-name>
    - Run the following command:
      EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => '&&XMETA_SCHEMA_NAME')
      Note: It is recommended that you update the DBMS statistics for all the tables in the metadata repository before you start the migration.
- Create extra indexes for data assets in data quality projects. Follow the instructions in the support document Indices for performance improvement in legacy migration from InfoSphere Information Server to IBM Knowledge Catalog.
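If your metadata repository is on Db2, one optional way to confirm that the manually created indexes exist is to query the Db2 catalog. This check is only a suggestion and not part of the documented procedure; run it as the Db2 instance user on the metadata repository tier, and note that the LIKE pattern matches the IDX2312* index names used in the commands above.
db2 connect to xmeta
db2 "SELECT INDSCHEMA, INDNAME, TABNAME FROM SYSCAT.INDEXES WHERE INDNAME LIKE 'IDX2312%'"
db2 connect reset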
Installing the migration toolkit
Install the migration toolkit.
- Download the migration toolkit for InfoSphere Information Server to the ${TOOLKIT_PATH} directory. Follow the instructions on the support page for the toolkit. That document is updated when a new version of the migration toolkit is released and also contains information about any prerequisite patches that you might need to install.
- Set the toolkit version and change to the ${TOOLKIT_PATH} directory:
  TOOLKIT_VERSION=<toolkit version>
  cd $TOOLKIT_PATH
- Extract the downloaded file to the ${TOOLKIT_PATH} directory.
  On Linux, run the following command:
  tar -zxvf iis-migration-toolkit-${TOOLKIT_VERSION}.tar.gz -C ${TOOLKIT_PATH}
  On AIX, run the following command:
  gunzip -c iis-migration-toolkit-${TOOLKIT_VERSION}.tar.gz | tar -xvf -
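As an optional sanity check that is not part of the documented steps, you can confirm that the toolkit extracted into the location that the next section expects:
# The initialization script used in the next section should now exist at this path.
ls -l ${TOOLKIT_PATH}/migration/iis/init_migration_iis.sh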
Running the initialization script
Run the init_migration_iis.sh script. The script is downloaded as part of the
migration toolkit and can be found in the TOOLKIT_PATH directory.
- Run the script as the root user:
  ${TOOLKIT_PATH}/migration/iis/init_migration_iis.sh "$IIS_INSTALL_PATH"
- Give the wkc user write and execute permission to the ${TOOLKIT_PATH} directory. Follow the instructions for your operating system:
  - Red Hat Enterprise Linux
    Run the following command:
    setfacl -m u:wkc:rwx ${TOOLKIT_PATH}
  - AIX
    Complete these steps:
    - Set the editor for editing the access control information:
      export EDITOR=/usr/bin/vi
    - Edit the access control information for the ${TOOLKIT_PATH} directory:
      acledit ${TOOLKIT_PATH}
      Add the following entry and save the information:
      extended permissions
          enabled
          permit rwx u:wkc
    - Edit the access control information for the /tmp directory:
      acledit /tmp
      Add the following entry and save the information:
      extended permissions
          enabled
          permit rwx u:wkc
  - SUSE Linux
    Run the following commands:
    zypper install acl
    setfacl -m u:wkc:rwx ${TOOLKIT_PATH}
    setfacl -m u:wkc:rwx /tmp
- Set the path to the export data directory and grant the wkc user write permission to that directory:
  - Set the EXPORT_DATA_DIR environment variable:
    EXPORT_DATA_DIR=<path to the export data directory>
  - Give the wkc user write and execute permission to the ${EXPORT_DATA_DIR} directory. Follow the instructions for your operating system:
    - Red Hat Enterprise Linux
      Run the following command:
      setfacl -m u:wkc:rwx ${EXPORT_DATA_DIR}
    - AIX
      Complete these steps:
      - Set the editor for editing the access control information:
        export EDITOR=/usr/bin/vi
      - Edit the access control information for the ${EXPORT_DATA_DIR} directory:
        acledit ${EXPORT_DATA_DIR}
        Add the following entry and save the information:
        extended permissions
            enabled
            permit rwx u:wkc
    - SUSE Linux
      Run the following command:
      setfacl -m u:wkc:rwx ${EXPORT_DATA_DIR}
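Optionally, on Linux you can verify the ACL entries before you continue. This check is only a suggestion; getfacl is provided by the same acl package as setfacl.
getfacl ${TOOLKIT_PATH}
getfacl ${EXPORT_DATA_DIR}
# Each listing should include an entry similar to: user:wkc:rwx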
Checking data integrity
To check the integrity of the data in InfoSphere Information Server, run the ISALite tool as the root user. Run the following command; passing ${IIS_INSTALL_PATH} ensures that fieldTask.IS.root is set to the correct installation path:
${TOOLKIT_PATH}/migration/iis_scripts/run_IIS_ISALite.sh ${IIS_INSTALL_PATH}
The tool takes some time to process the data and generate a report. The report is stored in the ISA_XMetHC_localhost_EngServ_${timestamp} subdirectory of the current directory.
Open the XMETAHealthChecker.html report file in a browser, and review the
results and instructions that it contains. Verify that no errors occurred and that all the probes
show the status SUCCESS.
Increasing the timeout value for the LTPA token
To ensure that the session can refresh without any issues, increase the timeout value for the LTPA token.
- For WebSphere Application Server Liberty
  Complete these steps:
  - Edit the ${IIS_INSTALL_PATH}/wlp/usr/servers/iis/server.xml file.
  - Find the <ltpa expiration="795m"/> entry and update this expiration value with a larger number. Either change it to 1440m, which corresponds to 24 hours, or 2880m, which is 48 hours. (See the optional sed sketch after this list.)
  - Restart the application server:
    - Stop the application server:
      ${IIS_INSTALL_PATH}/ASBServer/bin/MetadataServer.sh stop
    - Start the application server:
      ${IIS_INSTALL_PATH}/ASBServer/bin/MetadataServer.sh run
- For WebSphere Application Server Network Deployment
  Complete these steps:
  - Log in to the administrative console of the WebSphere application server.
  - Go to the LTPA settings in the security configuration.
  - Increase the timeout value. Either change it to 1440m, which corresponds to 24 hours, or 2880m, which is 48 hours.
  - Click Apply, OK, and Save.
  - Restart the application server. Complete the instructions that apply to your installation type.
    - Stand-alone installation
      - Stop the application server:
        ${IIS_INSTALL_PATH}/ASBServer/bin/MetadataServer.sh stop
      - Start the application server:
        ${IIS_INSTALL_PATH}/ASBServer/bin/MetadataServer.sh run
    - Clustered installation
      - Stop the cluster as described in WebSphere Application Server Network Deployment: Stopping clusters.
      - Start the cluster as described in WebSphere Application Server Network Deployment: Starting clusters.
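For the Liberty case, here is a minimal sketch of making the server.xml edit from the command line on Linux. It assumes the file still contains the default 795m value and that GNU sed is available; back up the file first and adjust the target value if you prefer 2880m.
cp ${IIS_INSTALL_PATH}/wlp/usr/servers/iis/server.xml ${IIS_INSTALL_PATH}/wlp/usr/servers/iis/server.xml.bak
sed -i 's|<ltpa expiration="795m"/>|<ltpa expiration="1440m"/>|' ${IIS_INSTALL_PATH}/wlp/usr/servers/iis/server.xml
grep '<ltpa' ${IIS_INSTALL_PATH}/wlp/usr/servers/iis/server.xml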
Installing required software packages
Download and install the IBM Semeru Runtimes package. Follow the instructions for your operating system.
- Red Hat Enterprise Linux
  Complete these steps:
  - Change to the wkc user and open a bash shell:
    su wkc
    bash
  - Change to the directory where the toolkit content is stored:
    TOOLKIT_PATH=<toolkit_path>
    cd $TOOLKIT_PATH
  - Download and install the IBM Semeru Runtimes OpenJDK 17 (jdk-17.0.9):
    curl -LO https://github.com/ibmruntimes/semeru17-binaries/releases/download/jdk-17.0.9%2B9_openj9-0.41.0/ibm-semeru-open-jdk_x64_linux_17.0.9_9_openj9-0.41.0.tar.gz
    tar -zxvf ibm-semeru-open-jdk_x64_linux_17.0.9_9_openj9-0.41.0.tar.gz
  - Set the path to point to the IBM JDK 17 java installed in the previous steps:
    export PATH=${TOOLKIT_PATH}/jdk-17.0.9+9/bin:${TOOLKIT_PATH}:$PATH
- AIX
  Complete these steps:
  - Change to the wkc user:
    su wkc
  - Change to the directory where the toolkit content is stored:
    TOOLKIT_PATH=<toolkit_path>
    cd $TOOLKIT_PATH
  - Download and install the IBM Semeru Runtimes OpenJDK 17:
    curl -LO https://github.com/ibmruntimes/semeru17-binaries/releases/download/jdk-17.0.9%2B9_openj9-0.41.0/ibm-semeru-open-jdk_ppc64_aix_17.0.9_9_openj9-0.41.0.tar.gz
    gunzip -c ibm-semeru-open-jdk_ppc64_aix_17.0.9_9_openj9-0.41.0.tar.gz | tar -xvf -
  - Set the path to point to the IBM JDK 17 java installed in the previous steps:
    export PATH=${TOOLKIT_PATH}/jdk-17.0.9+9/bin:${TOOLKIT_PATH}:$PATH
- SUSE Linux
  Complete these steps:
  - Change to the wkc user:
    su wkc
  - Change to the directory where the toolkit content is stored:
    TOOLKIT_PATH=<toolkit_path>
    cd $TOOLKIT_PATH
  - Download and install the IBM Semeru Runtimes OpenJDK 17:
    curl -LO https://github.com/ibmruntimes/semeru17-binaries/releases/download/jdk-17.0.9%2B9_openj9-0.41.0/ibm-semeru-open-jdk_x64_linux_17.0.9_9_openj9-0.41.0.tar.gz
    tar -zxvf ibm-semeru-open-jdk_x64_linux_17.0.9_9_openj9-0.41.0.tar.gz
  - Set the path to point to the IBM JDK 17 java installed in the previous steps:
    export PATH=${TOOLKIT_PATH}/jdk-17.0.9+9/bin:${TOOLKIT_PATH}:$PATH
- Red Hat Linux on System z and SUSE Linux on System z
  Complete these steps:
  - Change to the wkc user:
    su wkc
  - Change to the directory where the toolkit content is stored:
    TOOLKIT_PATH=<toolkit_path>
    cd $TOOLKIT_PATH
  - Download and install the IBM Semeru Runtimes OpenJDK 17:
    curl -LO https://github.com/ibmruntimes/semeru17-binaries/releases/download/jdk-17.0.9%2B9_openj9-0.41.0/ibm-semeru-open-jdk_s390x_linux_17.0.9_9_openj9-0.41.0.tar.gz
    tar -zxvf ibm-semeru-open-jdk_s390x_linux_17.0.9_9_openj9-0.41.0.tar.gz
  - Set the path to point to the IBM JDK 17 java installed in the previous steps:
    export PATH=${TOOLKIT_PATH}/jdk-17.0.9+9/bin:${TOOLKIT_PATH}:$PATH
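To confirm that the shell now resolves the extracted JDK, you can optionally run the following commands; this verification is a suggestion, not part of the documented steps.
which java
java -version   # should report an IBM Semeru Runtimes (OpenJ9) build of Java 17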
Creating a db2dsdriver.cfg configuration file
- On the InfoSphere Information Server services tier, run the following command to list the Db2 connections:
  ${IIS_INSTALL_PATH}/ASBServer/bin/xmetaAdmin.sh query -expr "select dc.name as connection_name, dc.username as user_name, dc.connectionString as database_name from connector in Connector, dc in connector->uses_DataConnection where connector.name='DB2Connector'" -dbfile ${IIS_INSTALL_PATH}/ASBServer/conf/database.properties http:///5.3/ASCLModel.ecore
  - Verify the output and check whether the Db2 connections are valid and required for migration.
  - Proceed with the remaining steps only if Db2 connections are to be migrated.
- Log in to the engine tier as the Db2 instance user.
- Check whether the Db2 client is installed on the engine tier.
- Create a db2dsdriver.cfg configuration file for the Db2 database on the InfoSphere Information Server engine tier host:
  - Set the following environment variables:
    DB2_INSTANCE_NAME=<db2-instance-name>
    OUTPUT_FOLDER=<output folder>
  - Create and populate the db2dsdriver.cfg configuration file by running the following command:
    db2dsdcfgfill -i ${DB2_INSTANCE_NAME} -o ${OUTPUT_FOLDER}
  - Make sure that read permission to the generated db2dsdriver.cfg file is granted to the group and other users. Run the following command:
    chmod 644 ${OUTPUT_FOLDER}/db2dsdriver.cfg
  - Check the content of the generated db2dsdriver.cfg file. If you find any local database entries with the settings host="LOCALHOST" and port="0", replace LOCALHOST with the correct hostname and update the port entry with the correct Db2 port number. Save your changes.
    For some Db2 versions, running the db2dsdcfgfill command might not create the db2dsdriver.cfg configuration file in the specified folder. If this error occurs when you run the db2dsdcfgfill command, check your Db2 client version and upgrade to version 11.5.7.0 if necessary. For information about upgrading the Db2 client, see Upgrading your IBM Db2 client instance.
- Make the db2dsdriver.cfg configuration file available to the ASBNode agent and to the Connector Access Service (CAS). As the root user, complete these steps:
  - Set the following environment variables:
    IIS_INSTALL_PATH=<IIS installation path>
    DB2_INSTANCE_NAME=<db2-instance-name>
    OUTPUT_FOLDER=<output folder>
  - Add the following environment variable to the ${IIS_INSTALL_PATH}/ASBNode/bin/NodeAgents_env_DS.sh file:
    export CC_DB2_CONNECTION_MIGRATION_DB2DSDRIVER_CFG_${DB2_INSTANCE_NAME}=${OUTPUT_FOLDER}/db2dsdriver.cfg
  - Restart the ASBNode agent by running the following commands. You must have read permission on the db2dsdriver.cfg configuration file.
    ${IIS_INSTALL_PATH}/ASBNode/bin/NodeAgents.sh stop
    ${IIS_INSTALL_PATH}/ASBNode/bin/NodeAgents.sh start
If you have multiple Db2 instances, complete these steps for each instance.
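For reference, here is a condensed sketch of generating and checking the driver file for a single hypothetical Db2 instance, run as the Db2 instance user on the engine tier. The instance name and output folder are placeholders, and the grep check at the end is only a suggested way to spot entries that still need a real hostname and port.
DB2_INSTANCE_NAME=db2inst1                      # placeholder instance name
OUTPUT_FOLDER=/opt/iis-migration-toolkit/db2cfg # placeholder output folder
db2dsdcfgfill -i ${DB2_INSTANCE_NAME} -o ${OUTPUT_FOLDER}
chmod 644 ${OUTPUT_FOLDER}/db2dsdriver.cfg
grep -n 'host="LOCALHOST"' ${OUTPUT_FOLDER}/db2dsdriver.cfg || echo "No LOCALHOST entries found"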
Determining the scope of export
Evaluate which data you want to migrate and remove all unnecessary data to avoid cluttering the new deployment.
What to do next
Complete the setup tasks for Cloud Pak for Data in Preparing for migration in IBM Cloud Pak for Data.