Installing watsonx Orchestrate

An instance administrator can install watsonx Orchestrate on IBM® Software Hub Version 5.2.

Who needs to complete this task?

Instance administrator
To install watsonx Orchestrate, you must be an instance administrator. An instance administrator has permission to install software in the following projects:

The operators project for the instance

The operators for this instance of watsonx Orchestrate are installed in the operators project.

In the installation commands, the ${PROJECT_CPD_INST_OPERATORS} environment variable refers to the operators project.

The operands project for the instance

The custom resources for the control plane and watsonx Orchestrate are installed in the operands project.

In the installation commands, the ${PROJECT_CPD_INST_OPERANDS} environment variable refers to the operands project.

When do you need to complete this task?

Review the following options to determine whether you need to complete this task:

  • If you want to install multiple services at the same time, follow the process in Running a batch installation of solutions and services instead.
  • If you didn't install watsonx Orchestrate as part of a batch installation, complete this task to add watsonx Orchestrate to your environment.

    Repeat as needed: If you are responsible for multiple instances of IBM Software Hub, you can repeat this task to install more instances of watsonx Orchestrate on the cluster.

Important: You cannot install watsonx Orchestrate and standalone watsonx Assistant on the same instance of IBM Software Hub. If you want to install these services on the same Red Hat® OpenShift® Container Platform cluster, you must install each service on a different instance of IBM Software Hub. For more information, see Multitenancy considerations.

Information you need to complete this task

Review the following information before you install watsonx Orchestrate:

Version requirements

All of the components that are associated with an instance of IBM Software Hub must be installed at the same release. For example, if the IBM Software Hub control plane is installed at Version 5.2.2, you must install watsonx Orchestrate at Version 5.2.2.

Environment variables

The commands in this task use environment variables so that you can run the commands exactly as written.

  • If you don't have the script that defines the environment variables, see Setting up installation environment variables.
  • To use the environment variables from the script, you must source the environment variables before you run the commands in this task. For example, run:
    source ./cpd_vars.sh
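Before you run any install commands, it can help to confirm that the key variables are actually defined in your shell. The following is a minimal sketch: the exported placeholder values stand in for the real values that your cpd_vars.sh script provides.

```shell
# Sanity check that the environment variables used later in this task are set.
# The exported values below are placeholders; in practice they come from your
# cpd_vars.sh script (source ./cpd_vars.sh).
export PROJECT_CPD_INST_OPERATORS=cpd-operators   # placeholder
export PROJECT_CPD_INST_OPERANDS=cpd-instance     # placeholder
export VERSION=5.2.2                              # placeholder

missing=0
for v in PROJECT_CPD_INST_OPERATORS PROJECT_CPD_INST_OPERANDS VERSION; do
  # Indirect lookup of the variable named in $v (POSIX-safe).
  val=$(eval "printf '%s' \"\${$v:-}\"")
  if [ -z "$val" ]; then
    echo "ERROR: $v is not set" >&2
    missing=1
  fi
done
[ "$missing" -eq 0 ] && echo "All required installation variables are set"
```

If any variable is reported as not set, source your environment variables script again before you continue.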
Security context constraint

watsonx Orchestrate works with the default Red Hat OpenShift Container Platform security context constraint, restricted-v2.

Storage requirements
You must specify storage classes when you install watsonx Orchestrate. The following storage classes are recommended. However, if you don't use these storage classes on your cluster, ensure that you specify a storage class with an equivalent definition.
Storage options, usage notes, and recommended storage classes:

OpenShift Data Foundation
When you install the service, specify file storage and block storage.
  • File storage: ocs-storagecluster-cephfs
  • Block storage: ocs-storagecluster-ceph-rbd
IBM Fusion Data Foundation
When you install the service, specify file storage and block storage.
  • File storage: ocs-storagecluster-cephfs
  • Block storage: ocs-storagecluster-ceph-rbd
IBM Fusion Global Data Platform
When you install the service, specify the same storage class for both file storage and block storage.
  • File storage: ibm-spectrum-scale-sc or ibm-storage-fusion-cp-sc
  • Block storage: ibm-spectrum-scale-sc or ibm-storage-fusion-cp-sc
IBM Storage Scale Container Native
Not supported.
Portworx
When you install the service, the --storage_vendor=portworx option ensures that the service uses the correct storage classes.
  • File storage: portworx-shared-gp3
  • Block storage: portworx-db-gp3-sc
NFS
Not supported.
Amazon Elastic storage
Not supported.
NetApp Trident
When you install the service, specify the same storage class for both file storage and block storage.
  • File storage: ontap-nas
  • Block storage: ontap-nas
Nutanix
Not supported.
Note:
  • 5.2.0 5.2.1 By default, ibm-slate-30m-english-rtrvr is enabled during watsonx Orchestrate installation. To avoid installation failures when installing from a private registry, ensure that this specific model is also mirrored during the mirroring process. Use the following values during the mirroring process:

    Mirror watsonx AI model: ibm-slate-30m-english-rtrvr

    IMAGE_GROUPS: ibmwxSlate30mEnglishRtrvr

  • Starting from Version 5.2.2, if you choose to configure external models through AI Gateway, installing watsonx.ai™ and Red Hat OpenShift AI becomes optional.
  • Starting from Version 5.2.2, the embedding model can be configured by using AI Gateway. For more information, see Registering the embedding model through AI Gateway.
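The install commands later in this task reference the storage classes through the ${STG_CLASS_FILE} and ${STG_CLASS_BLOCK} environment variables. As an example (a sketch, assuming an OpenShift Data Foundation cluster that uses the recommended class names from the storage table above), the variables would be set like this:

```shell
# Example values for an OpenShift Data Foundation cluster, matching the
# recommended storage classes in the storage table above. Confirm the actual
# class names on your cluster with: oc get storageclass
export STG_CLASS_FILE=ocs-storagecluster-cephfs
export STG_CLASS_BLOCK=ocs-storagecluster-ceph-rbd
echo "file=${STG_CLASS_FILE} block=${STG_CLASS_BLOCK}"
```

If your cluster uses different but equivalent storage classes, substitute those names instead.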

Before you begin

This task assumes that the following prerequisites are met:

System requirements
This task assumes that the cluster meets the minimum requirements for watsonx Orchestrate.
Where to find more information
If this task is not complete, see System requirements.
Workstation
This task assumes that the workstation from which you will run the installation is set up as a client workstation and has the following command-line interfaces:
  • IBM Software Hub CLI: cpd-cli
  • OpenShift CLI: oc
Where to find more information
If this task is not complete, see Setting up a client workstation.
Control plane
This task assumes that the IBM Software Hub control plane is installed.
Where to find more information
If this task is not complete, see Installing an instance of IBM Software Hub.
Private container registry
If your environment uses a private container registry (for example, your cluster is air-gapped), this task assumes that the following tasks are complete:
  1. The watsonx Orchestrate software images are mirrored to the private container registry.
    Where to find more information
    If this task is not complete, see Mirroring images to a private container registry.
  2. The cpd-cli is configured to pull the olm-utils-v3 image from the private container registry.
    Where to find more information
    If this task is not complete, see Pulling the olm-utils-v3 image from the private container registry.
GPU operators
If you plan to use features that require GPUs, this task assumes that the operators required to use GPUs are installed.
Where to find more information
If this task is not complete, see Installing operators for services that require GPUs.
Red Hat OpenShift AI
If you plan to use features that require Red Hat OpenShift AI, this task assumes that Red Hat OpenShift AI is installed.
Where to find more information
If this task is not complete, see Installing Red Hat OpenShift AI.
Multicloud Object Gateway
This task assumes that the following tasks are complete:
  1. Multicloud Object Gateway is installed and configured.
    Where to find more information
    If this task is not complete, see Installing Multicloud Object Gateway.
  2. The secrets that enable watsonx Orchestrate to connect to Multicloud Object Gateway exist.
    Where to find more information
    If this task is not complete, see Creating secrets for services that use Multicloud Object Gateway.
Red Hat OpenShift Serverless Knative Eventing
This task assumes that Red Hat OpenShift Serverless Knative Eventing is installed.
Where to find more information
If this task is not complete, see Installing Red Hat OpenShift Serverless Knative Eventing.

Preparing to install

Before you install watsonx Orchestrate, do the following:
Verify the Identity and Access Management (IAM) service
Verify that the Identity and Access Management (IAM) Service is enabled. By default, the IAM service is enabled in IBM Software Hub Version 5.2.0. For more information, see Integrating with the Identity Management Service.
GPU requirements for the models
For details on the GPU requirements for the models, see GPU requirements for models.
Other model requirements
Starting from Version 5.2.2, you must configure at least one large language model (LLM) and one embedding model in each service instance of watsonx Orchestrate.

Procedure

Complete the following tasks to install watsonx Orchestrate:

  1. Specifying installation options
  2. Starting the IBM Events Operator in the operators project
  3. Installing the service
  4. Validating the installation
  5. What to do next

Specifying installation options

If you plan to customize the watsonx Orchestrate installation, specify the appropriate installation options in a file named install-options.yml in the cpd-cli work directory (for example, cpd-cli-workspace/olm-utils-workspace/work).
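The rest of this section describes which parameters to put in the file. Mechanically, creating it can look like the following sketch. The directory path is the default cpd-cli work directory named above, and the parameter values shown are one example option set from this section; substitute the options for your scenario.

```shell
# Write an example install-options.yml to the cpd-cli work directory.
# The parameters shown install the llama-3-2-90b-vision-instruct model on the
# same cluster as watsonx Orchestrate (Version 5.2.2 layout); adjust as needed.
WORK_DIR=cpd-cli-workspace/olm-utils-workspace/work
mkdir -p "${WORK_DIR}"
cat <<EOF > "${WORK_DIR}/install-options.yml"
################################################################################
# watsonx Orchestrate parameters
################################################################################
watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_ootb_models:
  - llama-3-2-90b-vision-instruct
  - ibm-slate-30m-english-rtrvr
EOF
echo "Wrote ${WORK_DIR}/install-options.yml"
```

When you run the apply-cr command with --param-file, the work directory is mounted in the olm-utils container at /tmp/work, which is why the later commands reference /tmp/work/install-options.yml.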

The installation options that you specify depend on several factors. The first factor that you must consider is where you want to install the foundation models for watsonx Orchestrate. You can use foundation models on:
The same cluster as watsonx Orchestrate
Choosing a model
You must use one of the models provided by IBM. The features that you plan to use determine the model or models that you must install.
GPU requirements
You must have sufficient GPU on the cluster where you plan to install watsonx Orchestrate.
A remote or external cluster by using AI gateway
Choosing a model
You can choose whether to use:
  • One of the models provided by IBM

    If you use the models provided by IBM, the features that you plan to use determine the models that you must install.

  • A custom model

    If you use a custom model, you must register the external model through AI gateway.

GPU requirements
Local GPU is not required. Remote GPU might be required:
  • If you plan to host models on a remote cluster, you must have sufficient GPU on the cluster where you plan to install the foundation models.

    For more information on GPU requirements, consult the documentation from the model provider.

  • If you plan to use models hosted by a third party, you don't need GPU.
After you decide where you will install the foundation models, you can decide which features you want to install. You can install:
  • Only the agentic AI features
  • The agentic AI features and legacy features, such as conversational search and conversational skills.
Models provided by IBM

Review the following list to determine which model or models provide the features that you need:

granite-3-8b-instruct
  • Conversational search: answer generation
ibm-granite-8b-unified-api-model-v2
  • Conversational search: query rewrite
  • Conversational skills: custom actions information gathering
llama-3-1-70b-instruct
  • Agentic AI: tool and API orchestration
  • Conversational search: answer generation and query rewrite
  • Conversational skills: custom actions information gathering
llama-3-2-90b-vision-instruct
  • Agentic AI: domain agents, and tool and API orchestration
  • Conversational search: answer generation and query rewrite
  • Conversational skills: custom actions information gathering
Important: The llama-3-2-90b-vision-instruct model is recommended over the llama-3-1-70b-instruct model. The llama-3-2-90b-vision-instruct model offers:
  • Better performance
  • More accurate results
Private container registry users: You must mirror the images for the models that you plan to use to the private container registry. For more information, see Determining which models to mirror to your private container registry.

Choose the appropriate YAML based on where you plan to install the foundation models:

Install foundation models on the same cluster as watsonx Orchestrate
Choose the appropriate YAML based on the features that you want to install:
Agentic features only
Choose the appropriate YAML based on the model that you want to install.
  • To install the llama-3-1-70b-instruct model, choose the appropriate YAML based on the version of IBM Software Hub you installed:
    Version 5.2.0 or Version 5.2.1
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_install_mode: lite
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_ootb_models:
      - llama-3-1-70b-instruct
      - ibm-slate-30m-english-rtrvr
    Version 5.2.2
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_ootb_models:
      - llama-3-1-70b-instruct
      - ibm-slate-30m-english-rtrvr
  • To install the llama-3-2-90b-vision-instruct model, choose the appropriate YAML file based on the version of IBM Software Hub you installed:
    Version 5.2.0 or Version 5.2.1
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_install_mode: lite
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_ootb_models:
      - llama-3-2-90b-vision-instruct
      - ibm-slate-30m-english-rtrvr
    Version 5.2.2
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_ootb_models:
      - llama-3-2-90b-vision-instruct
      - ibm-slate-30m-english-rtrvr
Legacy features and agentic AI features
The parameters that you specify depend on the models that you want to install:
  • To install the granite-3-8b-instruct model, choose the appropriate YAML based on the version of IBM Software Hub you installed:
    Version 5.2.0 or Version 5.2.1
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_ootb_models:
      - granite-3-8b-instruct
      - ibm-slate-30m-english-rtrvr
    Version 5.2.2
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_install_mode: agentic_skills_assistant
    watson_orchestrate_ootb_models:
      - granite-3-8b-instruct
      - ibm-slate-30m-english-rtrvr
  • To install the ibm-granite-8b-unified-api-model-v2 model, choose the appropriate YAML based on the version of IBM Software Hub you installed:
    Version 5.2.0 or Version 5.2.1
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_syom_models:
      - ibm-granite-8b-unified-api-model-v2
    watson_orchestrate_ootb_models:
      - ibm-slate-30m-english-rtrvr
    Version 5.2.2
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_install_mode: agentic_skills_assistant
    watson_orchestrate_syom_models:
      - ibm-granite-8b-unified-api-model-v2
    watson_orchestrate_ootb_models:
      - ibm-slate-30m-english-rtrvr
  • To install the llama-3-1-70b-instruct model, choose the appropriate YAML based on the version of IBM Software Hub you installed:
    Version 5.2.0 or Version 5.2.1
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_ootb_models:
      - llama-3-1-70b-instruct
      - ibm-slate-30m-english-rtrvr
    Version 5.2.2
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_install_mode: agentic_skills_assistant
    watson_orchestrate_ootb_models:
      - llama-3-1-70b-instruct
      - ibm-slate-30m-english-rtrvr
  • To install the llama-3-2-90b-vision-instruct model, choose the appropriate YAML based on the version of IBM Software Hub you installed:
    Version 5.2.0 or Version 5.2.1
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_ootb_models:
      - llama-3-2-90b-vision-instruct
      - ibm-slate-30m-english-rtrvr
    Version 5.2.2
    ################################################################################
    # watsonx Orchestrate parameters
    ################################################################################
    watson_orchestrate_watsonx_ai_type: true
    watson_orchestrate_install_mode: agentic_skills_assistant
    watson_orchestrate_ootb_models:
      - llama-3-2-90b-vision-instruct
      - ibm-slate-30m-english-rtrvr
Using foundation models on a remote or external cluster by using AI gateway
Important: If you choose this option, you must register the models through AI gateway after you install watsonx Orchestrate.

Choose the appropriate YAML based on the features that you want to install:

Agentic features only
Choose the appropriate YAML based on the version of IBM Software Hub you installed:
Version 5.2.0 or Version 5.2.1
################################################################################
# watsonx Orchestrate parameters
################################################################################
watson_orchestrate_install_mode: lite

The ibm-slate-30m-english-rtrvr model, which does not require GPU, is automatically installed when you install watsonx Orchestrate.

Version 5.2.2
################################################################################
# watsonx Orchestrate parameters
################################################################################
watson_orchestrate_watsonx_ai_type: false
Legacy features and agentic AI features
Choose the appropriate YAML based on the version of IBM Software Hub you installed:
Version 5.2.0 or Version 5.2.1
################################################################################
# watsonx Orchestrate parameters
################################################################################
watson_orchestrate_watsonx_ai_type: false

The ibm-slate-30m-english-rtrvr model, which does not require GPU, is automatically installed when you install watsonx Orchestrate.

Version 5.2.2
################################################################################
# watsonx Orchestrate parameters
################################################################################
watson_orchestrate_watsonx_ai_type: false
watson_orchestrate_install_mode: agentic_skills_assistant
Property Description
watson_orchestrate_watsonx_ai_type

Specify whether to install Inference foundation models (watsonx_ai_ifm) based on where you plan to install the foundation models.

You can install the foundation models on:
  • The same cluster as watsonx Orchestrate

    In this situation, you must install Inference foundation models.

    If you choose this option, you must have sufficient GPU on the cluster where you plan to install watsonx Orchestrate.

  • A remote cluster

    If you choose this option, you must have sufficient GPU on the cluster where you plan to install the foundation models.

    After you install watsonx Orchestrate, you must use the AI gateway to connect to the foundation models on the remote cluster.

Default value
true

If you omit this option, the default value is used.

Valid values
false
Do not install Inference foundation models.

Specify this option if you plan to install the foundation models on a remote cluster.

true
Install the Inference foundation models.

Specify this option if you plan to install the foundation models on the cluster where you plan to install watsonx Orchestrate.

watson_orchestrate_install_mode

Specify the features that you plan to install.

The usage of this option depends on the version of IBM Software Hub you installed:

  • 5.2.0 5.2.1 If you plan to use only agentic AI features in watsonx Orchestrate, use the watson_orchestrate_install_mode parameter to install only the features that are needed for agentic AI.

    If you plan to use conversational search or conversational skills, do not specify this parameter.

  • 5.2.2 If you plan to use only agentic AI features in watsonx Orchestrate, do not specify this parameter.

    If you plan to use conversational search or conversational skills, use the watson_orchestrate_install_mode parameter to install agentic AI features, conversational search, and conversational skills.
Valid values
lite
5.2.0 5.2.1 Specify this option only if both of the following statements are true:
  • IBM Software Hub Version 5.2.0 or Version 5.2.1 is installed
  • You want to install only the agentic AI features
If you set watson_orchestrate_watsonx_ai_type: true to install the foundation models on the same cluster as watsonx Orchestrate, specify which model you want to install:
llama-3-2-90b-vision-instruct
To use the llama-3-2-90b-vision-instruct model (ID: llama-3-2-90b-vision-instruct), specify:
watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_install_mode: lite
watson_orchestrate_ootb_models:
  - llama-3-2-90b-vision-instruct
  - ibm-slate-30m-english-rtrvr
llama-3-1-70b-instruct
To use the llama-3-1-70b-instruct model (ID: llama-3-1-70b-instruct), specify:
watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_install_mode: lite
watson_orchestrate_ootb_models:
  - llama-3-1-70b-instruct
  - ibm-slate-30m-english-rtrvr
agentic_skills_assistant
5.2.2 Specify this option only if both of the following statements are true:
  • IBM Software Hub Version 5.2.2 is installed
  • You want to install legacy features, such as conversational search or conversational skills.
If you set watson_orchestrate_watsonx_ai_type: true to install the foundation models on the same cluster as watsonx Orchestrate, specify which model you want to install:
granite-3-8b-instruct
To use the granite-3-8b-instruct model (ID: granite-3-8b-instruct), specify:
watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_install_mode: agentic_skills_assistant
watson_orchestrate_ootb_models:
  - granite-3-8b-instruct
  - ibm-slate-30m-english-rtrvr
ibm-granite-8b-unified-api-model-v2
To use the ibm-granite-8b-unified-api-model-v2 model (ID: ibm-granite-8b-unified-api-model-v2), specify:
watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_install_mode: agentic_skills_assistant
watson_orchestrate_syom_models:
  - ibm-granite-8b-unified-api-model-v2
watson_orchestrate_ootb_models:
  - ibm-slate-30m-english-rtrvr
llama-3-2-90b-vision-instruct
To use the llama-3-2-90b-vision-instruct model (ID: llama-3-2-90b-vision-instruct), specify:
watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_install_mode: agentic_skills_assistant
watson_orchestrate_ootb_models:
  - llama-3-2-90b-vision-instruct
  - ibm-slate-30m-english-rtrvr
llama-3-1-70b-instruct
To use the llama-3-1-70b-instruct model (ID: llama-3-1-70b-instruct), specify:
watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_install_mode: agentic_skills_assistant
watson_orchestrate_ootb_models:
  - llama-3-1-70b-instruct
  - ibm-slate-30m-english-rtrvr
watson_orchestrate_ootb_models

This option is valid only if you install the foundation models on the same cluster as watsonx Orchestrate (watson_orchestrate_watsonx_ai_type: true).

Specify whether to install one or more general models.

Install models based on the features that you want to enable.

Default value
[]
Valid values
[]
Do not install a general model.
granite-3-8b-instruct
Install the granite-3-8b-instruct model (ID: granite-3-8b-instruct).

To install this model, include the following parameters in your install-options.yml file:

watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_ootb_models:
  - granite-3-8b-instruct
  - ibm-slate-30m-english-rtrvr
llama-3-1-70b-instruct
Install the llama-3-1-70b-instruct model (ID: llama-3-1-70b-instruct).

To install this model, include the following parameters in your install-options.yml file:

watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_ootb_models:
  - llama-3-1-70b-instruct
  - ibm-slate-30m-english-rtrvr
llama-3-2-90b-vision-instruct
Install the llama-3-2-90b-vision-instruct model (ID: llama-3-2-90b-vision-instruct).

To install this model, include the following parameters in your install-options.yml file:

watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_ootb_models:
  - llama-3-2-90b-vision-instruct
  - ibm-slate-30m-english-rtrvr
Installing multiple general models
If you want to install more than one general model, add the ID of each model that you want to install as a list item. For example, to install both the llama-3-1-70b-instruct and granite-3-8b-instruct models:
watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_ootb_models:
  - llama-3-1-70b-instruct
  - granite-3-8b-instruct
  - ibm-slate-30m-english-rtrvr
watson_orchestrate_syom_models

This option is valid only if:

  • You install the foundation models on the same cluster as watsonx Orchestrate (watson_orchestrate_watsonx_ai_type: true)
  • You install the legacy features
    Restriction: Do not specify this parameter if you specify watson_orchestrate_install_mode: lite.

Specify whether to install a specialized model that is specifically tuned for use with watsonx Orchestrate.

Default value
[]
Valid values
[]
Do not install a specialized model.
ibm-granite-8b-unified-api-model-v2
Install the ibm-granite-8b-unified-api-model-v2 model (ID: ibm-granite-8b-unified-api-model-v2).

To install this model, include the following parameters in your install-options.yml file:

watson_orchestrate_watsonx_ai_type: true
watson_orchestrate_syom_models:
  - ibm-granite-8b-unified-api-model-v2

Starting the IBM Events Operator in the operators project

5.2.0 5.2.1

If you installed only the agentic features of watsonx Orchestrate by setting watson_orchestrate_install_mode: lite, you must create an operand request in the operands project to start the IBM Events Operator (ibm-events-operator) in the operators project.

To start the IBM Events Operator, create the following operand request:
cat <<EOF | oc apply -f -
apiVersion: operator.ibm.com/v1alpha1
kind: OperandRequest
metadata:
  name: wo-operand-request
  namespace: ${PROJECT_CPD_INST_OPERANDS}
spec:
  requests:
  - operands:
    - name: ibm-events-operator-v5.1
    registry: common-service
EOF
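If you prefer to review the manifest before you apply it, you can write it to a file first. The following sketch uses a placeholder namespace if ${PROJECT_CPD_INST_OPERANDS} is not set; in practice the value comes from your environment variables script, and you would apply the file with oc apply -f wo-operand-request.yaml.

```shell
# Write the OperandRequest manifest to a file for review before applying it.
# PROJECT_CPD_INST_OPERANDS normally comes from cpd_vars.sh; a placeholder
# value is used here only if it is unset.
PROJECT_CPD_INST_OPERANDS=${PROJECT_CPD_INST_OPERANDS:-cpd-instance}
cat <<EOF > wo-operand-request.yaml
apiVersion: operator.ibm.com/v1alpha1
kind: OperandRequest
metadata:
  name: wo-operand-request
  namespace: ${PROJECT_CPD_INST_OPERANDS}
spec:
  requests:
  - operands:
    - name: ibm-events-operator-v5.1
    registry: common-service
EOF
echo "Manifest written; apply it with: oc apply -f wo-operand-request.yaml"
```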

Installing the service

To install watsonx Orchestrate:

  1. Log the cpd-cli in to the Red Hat OpenShift Container Platform cluster:
    ${CPDM_OC_LOGIN}
    Remember: CPDM_OC_LOGIN is an alias for the cpd-cli manage login-to-ocp command.
  2. Run the following command to create the required OLM objects for watsonx Orchestrate in the operators project for the instance:
    cpd-cli manage apply-olm \
    --release=${VERSION} \
    --cpd_operator_ns=${PROJECT_CPD_INST_OPERATORS} \
    --components=watsonx_orchestrate
    Wait for the cpd-cli to return the following message before you proceed to the next step:
    [SUCCESS]... The apply-olm command ran successfully

    If the apply-olm command fails, see Troubleshooting the apply-olm command during installation or upgrade.

  3. Create the custom resource for watsonx Orchestrate.

    The command that you run depends on the storage on your cluster.


    Red Hat OpenShift Data Foundation storage

    Run the appropriate command to create the custom resource.

    Default installation (without installation options)
    cpd-cli manage apply-cr \
    --components=watsonx_orchestrate \
    --release=${VERSION} \
    --cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} \
    --block_storage_class=${STG_CLASS_BLOCK} \
    --file_storage_class=${STG_CLASS_FILE} \
    --license_acceptance=true
    Custom installation (with installation options)
    cpd-cli manage apply-cr \
    --components=watsonx_orchestrate \
    --release=${VERSION} \
    --cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} \
    --block_storage_class=${STG_CLASS_BLOCK} \
    --file_storage_class=${STG_CLASS_FILE} \
    --param-file=/tmp/work/install-options.yml \
    --license_acceptance=true

    IBM Fusion Data Foundation storage

    Run the appropriate command to create the custom resource.

    Default installation (without installation options)
    cpd-cli manage apply-cr \
    --components=watsonx_orchestrate \
    --release=${VERSION} \
    --cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} \
    --block_storage_class=${STG_CLASS_BLOCK} \
    --file_storage_class=${STG_CLASS_FILE} \
    --license_acceptance=true
    Custom installation (with installation options)
    cpd-cli manage apply-cr \
    --components=watsonx_orchestrate \
    --release=${VERSION} \
    --cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} \
    --block_storage_class=${STG_CLASS_BLOCK} \
    --file_storage_class=${STG_CLASS_FILE} \
    --param-file=/tmp/work/install-options.yml \
    --license_acceptance=true

    Portworx storage

    Run the appropriate command to create the custom resource.

    Default installation (without installation options)
    cpd-cli manage apply-cr \
    --components=watsonx_orchestrate \
    --release=${VERSION} \
    --cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} \
    --storage_vendor=portworx \
    --license_acceptance=true
    Custom installation (with installation options)
    cpd-cli manage apply-cr \
    --components=watsonx_orchestrate \
    --release=${VERSION} \
    --cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} \
    --storage_vendor=portworx \
    --param-file=/tmp/work/install-options.yml \
    --license_acceptance=true

    NetApp Trident
    Remember: When you use NetApp Trident storage, both ${STG_CLASS_BLOCK} and ${STG_CLASS_FILE} point to the same storage class, typically ontap-nas.

    Run the appropriate command to create the custom resource.

    Default installation (without installation options)
    cpd-cli manage apply-cr \
    --components=watsonx_orchestrate \
    --release=${VERSION} \
    --cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} \
    --block_storage_class=${STG_CLASS_BLOCK} \
    --file_storage_class=${STG_CLASS_FILE} \
    --license_acceptance=true
    Custom installation (with installation options)
    cpd-cli manage apply-cr \
    --components=watsonx_orchestrate \
    --release=${VERSION} \
    --cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} \
    --block_storage_class=${STG_CLASS_BLOCK} \
    --file_storage_class=${STG_CLASS_FILE} \
    --param-file=/tmp/work/install-options.yml \
    --license_acceptance=true

Validating the installation

watsonx Orchestrate is installed when the apply-cr command returns:
[SUCCESS]... The apply-cr command ran successfully

If you want to confirm that the custom resource status is Completed, you can run the cpd-cli manage get-cr-status command:

cpd-cli manage get-cr-status \
--cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} \
--components=watsonx_orchestrate
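If you script the validation, you can capture the command output and check it for the Completed status. This is a sketch only: the sample line below stands in for whatever cpd-cli manage get-cr-status actually prints on your cluster, so adjust the match to your real output format.

```shell
# Check captured get-cr-status output for the Completed status.
# In practice you would capture real output, for example:
#   status_output=$(cpd-cli manage get-cr-status \
#     --cpd_instance_ns=${PROJECT_CPD_INST_OPERANDS} --components=watsonx_orchestrate)
status_output="watsonx_orchestrate   wo-cr   5.2.2   Completed"   # sample line
if printf '%s\n' "$status_output" | grep -q "Completed"; then
  echo "watsonx Orchestrate custom resource status is Completed"
else
  echo "Custom resource is not ready yet" >&2
fi
```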

What to do next

  1. Complete the mandatory post-installation setup tasks. For more information, see Post-installation setup for watsonx Orchestrate.
  2. To enable the users to access watsonx Orchestrate, see Giving users access to a watsonx Orchestrate instance.