Troubleshooting migration from InfoSphere Information Server (IBM Knowledge Catalog)
- Finding the build version of the migration toolkit
- Collecting summary and logs for migration
- Importing data fails due to missing custom attributes
- Rerunning the import in case of a failed or stuck initial run
- Importing data when the export of some entities failed
- Running the export fails due to missing main class
Finding the build version of the migration toolkit
- InfoSphere Information Server
- Run the following command to find out the build version of the migration toolkit you are using:
cat ${TOOLKIT_PATH}/migration/version.txt
- IBM Knowledge Catalog
- Run the following command to find out the build version of the migration toolkit you are using:
oc get cm legacy-migration-aux-exim-cm -o yaml | grep "buildVer"
Collecting summary and logs for migration
- InfoSphere Information Server
- To generate the archive file that contains the diagnostic logs in the ${TOOLKIT_PATH} folder, complete these steps:
- Set the following environment variables and change to the ${TOOLKIT_PATH} directory:
TOOLKIT_PATH=<path to the directory where migration-related files are stored>
EXPORT_INSTANCE_NAME=<name of the export instance>
EXPORT_DATA_DIR=<path to the export data directory; the wkc user must have write permission to this directory>
PREFIX=<prefix of the archive file>
cd ${TOOLKIT_PATH}
- Generate the archive file:
${TOOLKIT_PATH}/migration/utils/lm_logs_collector.sh diagnostics ${PREFIX} ${EXPORT_DATA_DIR}/${EXPORT_INSTANCE_NAME}/20*/legacy-migration
- Copy the archive file to the ${TOOLKIT_PATH} folder:
cp ${EXPORT_DATA_DIR}/${EXPORT_INSTANCE_NAME}/20*/legacy-migration/${PREFIX}_*.zip ${TOOLKIT_PATH}
- IBM Knowledge Catalog
- To create the archive file that contains the diagnostic logs, complete these steps:
- Set the following environment variables:
EXPORT_INSTANCE_NAME=<name of the export instance>
NAMESPACE=<namespace in the target CP4D>
PREFIX=<prefix of the diagnostic zip file>
- Get the name of the import pod:
LM_IMPORT_POD_NAME=`oc get pods -n ${NAMESPACE} -o custom-columns=POD:.metadata.name | grep cpd-im | xargs -L 1 oc logs -n ${NAMESPACE} | grep "job pods:" | awk -F [ '{print$3}' | awk -F ] '{print$1}'`
- Create an archive file that contains the diagnostic logs for the import:
oc debug ${LM_IMPORT_POD_NAME} -n ${NAMESPACE} -- bash -c "/migration/utils/lm_logs_collector.sh diagnostics ${PREFIX}"
- Download the archive file by running the following commands:
- Download the archive file that contains the import summary:
CPD_AUX_POD=`oc get pods -n ${NAMESPACE} -o custom-columns=POD:.metadata.name | grep cpd-aux`
oc exec -it $CPD_AUX_POD -n ${NAMESPACE} -- bash -c "mkdir -p /tmp/${EXPORT_INSTANCE_NAME};cp /data/cpd/data/exports/${NAMESPACE}/${EXPORT_INSTANCE_NAME}/20*/legacy-migration/${PREFIX}_*.zip /tmp/${EXPORT_INSTANCE_NAME}/."
- Copy the archive to the current directory:
oc cp ${NAMESPACE}/$CPD_AUX_POD:tmp/${EXPORT_INSTANCE_NAME}/. .
Importing data fails due to missing custom attributes
The import of data assets can fail if the data assets have custom attributes attached where the attribute definitions were created in InfoSphere Information Server version 11.5 Rollup Patch 3 or earlier.
To fix the issue, follow the instructions in the InfoSphere Information Governance Catalog: Custom attributes do not function as expected support document. Then, rerun both the export and the import.
Rerunning the import in case of a failed or stuck initial run
If the initial import failed or got stuck, you can rerun the import. By default, only the failed or skipped metadata of asset types are imported on rerun. You can choose to reimport a selected subset of the failed or skipped metadata or to reimport all assets.
To check the status of the import, run the following commands:
CPD_AUX_POD=`oc get pods -n ${NAMESPACE} -o custom-columns=POD:.metadata.name | grep cpd-aux`
oc exec -it ${CPD_AUX_POD} -- bash -l -c "cat /data/cpd/data/exports/${NAMESPACE}/${EXPORT_INSTANCE_NAME}/20*/legacy-migration/import-status.json"
- The import status is failed.
You can review the import-failed-summary.json file, which contains a detailed failure report. To access the report, follow the instructions in Verification steps on the Cloud Pak for Data system.
- The import status is running with the message Validation in progress.
The import is still in progress. No action is required.
- The import status is running with the message In progress.
The import might be stuck. To confirm that the import is stuck, complete these steps:
- In the UI, check whether projects without content are created. If so, the import is stuck.
- Also in the UI, check whether new assets are added to the project that was created last or to the target catalog. If no new assets are created, the import is stuck.
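The exact schema of import-status.json is not spelled out here; assuming for illustration that it carries the status in a top-level "status" field, this sketch extracts that value from a local copy of the file (the sample content is invented):

```shell
# Illustrative only: assumes import-status.json contains a top-level "status"
# field, for example {"status": "running", "message": "Validation in progress"}.
cat > import-status.json <<'EOF'
{"status": "running", "message": "Validation in progress"}
EOF
# Pull out the status value with portable sed (no jq required)
status=$(sed -n 's/.*"status"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p' import-status.json)
echo "$status"
```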
For a failed or stuck import, complete these steps:
- Collect diagnostic logs as described in Collecting summary and logs for migration.
- If an import fails or gets stuck, not all background processes might be stopped. To prevent such background processes from causing failures when you rerun the import, stop any that are still running:
- Delete the import job by running the following command. None of the already imported assets in the catalog or any of the projects are deleted. For proper cleanup, wait for a few minutes before you proceed.
cpd-cli export-import import delete --profile=${PROFILE_NAME} ${IMPORT_INSTANCE_NAME}
- Stop any running legacy-migration-import pods.
- To find the running pods, run the following command:
oc get pods -n ${NAMESPACE} | grep Running | grep legacy-migration-import-job
- To get the job name, remove the last 5 characters and the hyphen from the returned pod name. For example, if the pod name is legacy-migration-import-job-vv8fd-q8msw, the job name is legacy-migration-import-job-vv8fd.
- Delete the job to stop the running pod. Replace <xxxxx> as appropriate.
oc delete job -n ${NAMESPACE} legacy-migration-import-job-<xxxxx>
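The naming rule above can be expressed directly in shell: stripping everything after the last hyphen of the pod name yields the job name. A minimal sketch using the example pod name from this section:

```shell
# Derive the Kubernetes job name from a pod name by stripping the final
# 5-character suffix together with its hyphen.
pod_name="legacy-migration-import-job-vv8fd-q8msw"
job_name="${pod_name%-*}"   # remove the shortest trailing "-<suffix>"
echo "${job_name}"          # legacy-migration-import-job-vv8fd
```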
- Stop any running catalog-api-import pods.
- To find the running pods, run the following command:
oc get pods -n ${NAMESPACE} | grep Running | grep catalog-api-import-job
- To get the job name, remove the last 5 characters and the hyphen from the returned pod name. For example, if the pod name is catalog-api-import-job-vv8fd-q8msw, the job name is catalog-api-import-job-vv8fd.
- Delete the job to stop the running pod. Replace <xxxxx> as appropriate.
oc delete job -n ${NAMESPACE} catalog-api-import-job-<xxxxx>
- Rerun the import.
- The cause for the import issue was fixed on the import side.
For example, you increased the space in any IBM Knowledge Catalog services or applied a patch for the import command.
Rerun the import by following the instructions in Importing the data to the target system. You can choose to reimport a subset of the failed data archives or to reimport all data. See Reimport a subset of the failed data archives and Reimport all data.
Only data archives that have failed assets from the initial import attempt are reimported. These data archives are listed in the import-failed-summary.json file that is created during the initial import.
- The cause for the import issue was fixed on the export side.
For example, you applied a patch for the export command. Complete these steps:
- Rerun the export in InfoSphere Information Server with a different EXPORT_INSTANCE_NAME value. Follow the instructions in Exporting data from the InfoSphere Information Server system.
- Transfer the exported data to Cloud Pak for Data.
- Rerun the import with an updated import_params.yaml file as described in Importing the data to the target system. You can choose to reimport a subset of the failed data archives or to reimport all data. See Reimport a subset of the failed data archives and Reimport all data.
- Reimport a subset of the failed data archives
- To reimport only a subset of the failed data archives, edit the import-failed-summary.json file. Add "skip_import": "true" for any archive that you want to skip, as shown in this example:
{ "plugin_name": "data-quality", "entity_type": "projects", "zip_file_path": "data-quality/projects/data/projects/project.6c1d9fb6-19a6-3af8-b774-ef75bd48b7f1.zip", "failed_count": "1", "wkc_project_id": "3b49cd13-a4f0-4f6c-bfcb-79d71a1d1c3d", "wkc_project_name": "EditScenariosTest", "iis_project_rid": "ec1481df.64b1b87d.1oropaet6.p704l8e.tnsj0h.al57ma9r2eqrlfkes9co2", "iis_project_name": "EditScenariosTest", "skip_import": "true" }
Then add the following parameter to the import_params.yaml file:
REIMPORT_FILTER: 'Content of the import-failed-summary.json file on a single line'
You can obtain the content of the import-failed-summary.json file in the required format with the following command:
cat import-failed-summary.json | jq -c
The options REIMPORT_FILTER and REIMPORT_STRATEGY are mutually exclusive.
- Reimport all data
- Add the following parameter to the import_params.yaml file:
REIMPORT_STRATEGY: all
The options REIMPORT_STRATEGY and REIMPORT_FILTER are mutually exclusive.
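Putting the subset workflow together, this sketch collapses an edited summary file into the single-line value that REIMPORT_FILTER expects. It assumes jq is installed and, for illustration only, that the file is a JSON array of entries; the real import-failed-summary.json is generated by the initial import and has more fields per entry:

```shell
# Sketch: produce the single-line REIMPORT_FILTER value with jq.
# The sample file below is invented and much shorter than the real one.
cat > import-failed-summary.json <<'EOF'
[
  {
    "plugin_name": "data-quality",
    "entity_type": "projects",
    "failed_count": "1",
    "skip_import": "true"
  }
]
EOF
# Collapse the (possibly pretty-printed) JSON to one compact line
jq -c . import-failed-summary.json
```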
Importing data when the export of some entities failed
- To create the export archive with the partially exported data, run the following command:
${TOOLKIT_PATH}/migration/iis_scripts/create_archive_force_import.sh -dir ${EXPORT_DATA_DIR} -export_instance_name ${EXPORT_INSTANCE_NAME} -namespace ${NAMESPACE}
- Transfer the data to the Cloud Pak for Data system as described in Transferring the exported data.
Running the export fails due to missing main class
The export fails with the error message Error: Could not find or load main class. This error occurs if you extracted the iis-migration-toolkit-<version>.tar.gz file multiple times into the same directory.
To fix the issue, extract the iis-migration-toolkit-<version>.tar.gz file into an empty folder.
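A minimal sketch of that precaution, guarding against re-extracting into a directory that already has content (the archive name and paths are placeholders, and the tar step is left commented out so the guard can be shown on its own):

```shell
# Sketch: only extract the toolkit into an empty directory, to avoid the
# "Could not find or load main class" error caused by overlapping extractions.
TOOLKIT_TGZ="iis-migration-toolkit-<version>.tar.gz"   # placeholder file name
TOOLKIT_PATH="$(mktemp -d)"                            # fresh, empty directory

if [ -n "$(ls -A "${TOOLKIT_PATH}")" ]; then
    echo "ERROR: ${TOOLKIT_PATH} is not empty; extract into a fresh directory" >&2
    exit 1
fi
# tar -xzf "${TOOLKIT_TGZ}" -C "${TOOLKIT_PATH}"      # run once the archive exists
```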