Configuring an external IFM

External Inferencing Foundation Model (IFM) configuration allows the watsonx Orchestrate cluster to communicate with the watsonx.ai cluster. The watsonx.ai cluster hosts the LLMs that generate answers to questions in AI Assistant. After you install watsonx Orchestrate, configure the external IFM.

Before you begin

  • Ensure that you have created space for the watsonx.ai deployment.
  • Obtain the username and API key for the cluster that contains the watsonx.ai deployment. For more information, see Generating API keys for authentication.
  • Import certificates from the watsonx.ai cluster to enable the IFM connection:
    1. Run the following command to get the certificate that you want to import.
      openssl s_client -connect <target-server>:443 -showcerts
      If you want to connect to a separate cluster that runs watsonx.ai, use the following commands. The route points to the remote Cloud Pak for Data cluster that watsonx Assistant connects to.
      export ROUTE=$(oc get route -n cpd cpd -o jsonpath='{.spec.host}')
      openssl s_client -connect ${ROUTE}:443 -showcerts
    2. In the resultant output, find the certificates that you need. The output is similar to the following example.
      -----BEGIN CERTIFICATE-----
      <content of the certificate>==
      -----END CERTIFICATE-----
      1 s:CN = ingress-operator@1721342502
      i:CN = ingress-operator@1721342502
      -----BEGIN CERTIFICATE-----
      <content of the certificate>==
      -----END CERTIFICATE-----
    3. Remove the lines between END CERTIFICATE and BEGIN CERTIFICATE, and then save the certificates to a file.
      Ensure the following:
      • No spaces or blank lines remain where you removed the lines.
      • The file name ends with ".crt". For example, "wac-007.crt".
    4. Copy the certificate file to a system that has Red Hat OpenShift access to the watsonx Assistant cluster.
    5. Import the custom certificate by performing the steps provided in Creating a secret to store shared custom certificates.
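The certificate extraction in the preceding steps can also be scripted. The following is a minimal sketch: the route namespace and name (cpd/cpd) and the output file name wxai-chain.crt are assumptions, and the awk range filter keeps only the lines between each BEGIN and END marker, which removes the subject and issuer lines automatically.

```shell
# Sketch only: route namespace/name and output file name are assumptions;
# adjust them for your environment.
ROUTE=$(oc get route -n cpd cpd -o jsonpath='{.spec.host}')

# Fetch the chain and keep only the certificate blocks. The awk range
# pattern prints from each BEGIN marker through the matching END marker,
# so the s:CN/i:CN lines between certificates are dropped.
openssl s_client -connect "${ROUTE}:443" -showcerts </dev/null 2>/dev/null \
  | awk '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/' \
  > wxai-chain.crt
```

Because the filter drops everything outside the certificate blocks, no manual editing of the saved file is needed before you import it.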

Procedure

  1. Apply the following secret to the cluster.
    apiVersion: v1
    data:
      api_key: <api_key>
    kind: Secret
    metadata:
      name: watsonx-ai-api-key
      namespace: <cpd-namespace>
    type: Opaque
    Where <api_key> is your base64-encoded API key and <cpd-namespace> is your Cloud Pak for Data namespace.
  2. Apply the following configmap to the watsonx Assistant cluster.
    cat <<EOF | oc apply -f -
    apiVersion: v1
    data:
      api_key_secret_name: watsonx-ai-api-key
      url: https://<cpd-route>
      username: <username>
    kind: ConfigMap
    metadata:
      name: watsonx-ai-connection-config
      namespace: <cpd-namespace>
    EOF
  3. Run the following Red Hat OpenShift CLI patch command on the watsonx Orchestrate cluster to enable the external IFM configuration.
    oc patch wo wo --type=merge -p '{"spec": {"watsonAssistants": {"config": {"configOverrides": {"watsonx_enabled": true}}}}}'
    After you configure the external IFM, verify that the LLM configuration is working by using the AI Assistant.
  4. To enable the external IFM and also disable the IFM component in the store, run the following Red Hat OpenShift CLI patch command on the watsonx Orchestrate cluster:
    oc -n <cpd-namespace> patch wo wo --type=merge -p '{"spec": {"watsonAssistants": {"config": {"configOverrides": {"enabled_components": {"store": {"ifm": false }}, "watsonx_enabled": true }}}}}'