Configuring and validating the Open Data for Industries environment

After you install the Open Data for Industries service, you need to configure and validate your environment through a build verification test (BVT).

Before you begin

Prerequisites

  1. If the Utility Installer Container is not running, pull and run the installer image. Use the following command with the Podman runtime.
    podman run -it --rm icr.io/odi-utility-tool/odi-ansible-installer:stable302
  2. Validate that the values in odi-vars.yaml match the values that you used during the software utilities installation. The file is located in the playbooks/vars folder of the running container.
    Note: If you use a verified certificate, set storage_class to ibmc-block-gold and use_default_ca to false. If you use an unverified or self-signed certificate, set storage_class to ocs-storagecluster-ceph-rbd and use_default_ca to true.
    Variable | Type | Required | Default value | Description
    project_name | String | Yes | Osdu-tenant | The Red Hat® OpenShift® project where the utilities are provisioned.
    storage_class | String | Yes | ibmc-block-gold | The class of storage that abstracts the underlying storage provider. For more information, see Storage requirements.
    block_storage_class | String | Yes | ibmc-block-bronze | The block storage class. For more information, see Storage requirements.
    file_storage_class | String | Yes | ibmc-file-gold | The file storage class. For more information, see Storage requirements.
    use_default_ca | Boolean string | Yes | false | When set to "true", indicates that the Open Data for Industries endpoints use an internal CA.
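With a verified certificate, the relevant entries of the vars file might look like the following sketch. The values shown are illustrative defaults only; keep the values that you set during installation.

```yaml
# Illustrative excerpt of playbooks/vars/odi-vars.yaml -- example values only.
project_name: Osdu-tenant
storage_class: ibmc-block-gold        # ocs-storagecluster-ceph-rbd for self-signed certificates
block_storage_class: ibmc-block-bronze
file_storage_class: ibmc-file-gold
use_default_ca: "false"               # "true" for self-signed certificates
```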

Configuration procedure

  1. Log in to the target Red Hat OpenShift cluster. Use the command line of the running container and run the oc command.
    oc login --token={user_token_Openshift_Webconsole} --server=https://{server_API_url}
  2. Use the command line of the running container to run the Ansible® playbook that configures the IBM Open Data for Industries utilities. Make sure that the command line is at path /playbooks in the running container.
    ansible-playbook odi-install/post-install.yml --extra-vars="@vars/odi-vars.yaml"
  3. Check whether the default Open Data for Industries schemas are ingested.
    1. Get CouchDB credentials.
    2. Get the URL to log in to CouchDB web console.
      Note: Replace project_name with the project where Open Data for Industries is installed.
       oc get routes -n {{project_name}} | egrep -ai couch | awk '{print $2}'
    3. To open the CouchDB database console, construct the console URL:
      1. Prepend https:// to the hostname that you obtained.
      2. Append /_utils to the end of the URL.
    4. Locate the oc-cpd-dataecosystem-opendes-schema2 database. It contains the Open Data for Industries schema documents.
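Steps 2 and 3 above can be sketched as a short shell snippet. The hostname is a hypothetical placeholder; substitute the value that the oc get routes command returns.

```shell
# Compose the CouchDB console URL from the route hostname.
# "couchdb-odi.example.com" is a hypothetical host; substitute the value
# returned by `oc get routes -n {{project_name}}`.
couch_host="couchdb-odi.example.com"
couch_url="https://${couch_host}/_utils"
echo "${couch_url}"   # → https://couchdb-odi.example.com/_utils
```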
  4. Validate that the Apache Airflow utility is installed and configured.
    1. Get the Airflow URL.
      oc get routes -n {{project_name}} | egrep -ai airflow-web | awk '{print $2}'
    2. Get the Apache Airflow console URL by adding https:// to the hostname that you obtained.
    3. Get the configured default credentials (username and password) for Apache Airflow.
      oc get secrets props-secret -n {{project_name}} -o jsonpath="{.data.ibm\.username4airflow}"|base64 -d
      oc get secrets props-secret -n {{project_name}} -o jsonpath="{.data.ibm\.password4airflow}"|base64 -d
    4. Log in to the Apache Airflow console by using the credentials and validate that the Apache Airflow variables are all set.
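The secret values that the jsonpath queries return are base64-encoded, which is why each command pipes the output through base64 -d. A quick illustration with a placeholder value (not a real credential):

```shell
# jsonpath output from `oc get secrets` is base64-encoded; base64 -d
# recovers the plaintext. "YWlyZmxvdy1hZG1pbg==" is the base64 encoding
# of the placeholder string "airflow-admin".
encoded="YWlyZmxvdy1hZG1pbg=="
decoded=$(printf '%s' "${encoded}" | base64 -d)
echo "${decoded}"   # → airflow-admin
```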
  5. To get the URLs of the remaining utility services, use the following commands:
    Red Hat AMQ Broker
    oc get routes -n {{project_name}} | egrep -ai amq | awk '{print $2}'
    Keycloak
    oc get routes -n {{project_name}} | egrep -ai keycloak | awk '{print $2}'
    MinIO
    oc get routes -n {{project_name}} | egrep -ai minio | awk '{print $2}'
    Elasticsearch
    oc get routes -n {{project_name}} | egrep -ai elastic | awk '{print $2}'
    Cloud Pak for Data control plane
    oc get routes -n {{project_name}} | egrep -ai cpd | awk '{print $2}'
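Each of the commands above follows the same pattern: filter the route list for a keyword, then print the host column with awk. The pattern can be sketched against sample output; the hosts below are hypothetical, and in practice the input comes from oc get routes -n {{project_name}}.

```shell
# Sample `oc get routes` output with hypothetical hosts; in practice,
# generate this with: oc get routes -n "${project_name}"
routes='keycloak      keycloak-odi.example.com      keycloak   8080
minio         minio-odi.example.com         minio      9000'

# Print the host column (field 2) of the route whose name matches a keyword.
host_for() {
  printf '%s\n' "${routes}" | grep -i "$1" | awk '{print $2}'
}

host_for keycloak   # → keycloak-odi.example.com
```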

Validation procedure

To validate the successful installation and configuration of the Open Data for Industries core services, you can run the following tests:

Preliminary regression testing
Use this test to quickly check the minimum configuration that the core services require.
Note: Make sure that the Utility Installer Container is running and that the variables are configured as explained in the Installing software utilities procedure.
  1. Log in to the target Red Hat OpenShift cluster. Use the command line of the running container and run the oc command.
    oc login --token={user_token_Openshift_Webconsole} --server=https://{server_API_url}
  2. Use the command line of the running container to run the Ansible playbook for regression testing. Make sure that the command line is at path /playbooks in the running container.
    ansible-playbook odi-install/sanity-test.yml --extra-vars="@vars/odi-vars.yaml"

As a result, the execution log shows the test results.

Preliminary functional smoke test
Use this API test suite to validate the interaction between the Open Data for Industries core services and the utilities.
Note: Make sure that the Utility Installer Container is running and that the variables are configured as explained in the Installing software utilities procedure.
  1. Log in to the target Red Hat OpenShift cluster. Use the command line of the running container and run the oc command.
    oc login --token={user_token_Openshift_Webconsole} --server=https://{server_API_url}
  2. Use the command line of the running container to run the Ansible playbook for preliminary functional smoke test. Make sure that the command line is at path /playbooks in the running container.
    ansible-playbook odi-install/smoke-test.yml --extra-vars="@vars/odi-vars.yaml"

As a result, the execution log shows the test results.