Configuring an external IFM
External Inferencing Foundation Models (IFM) configuration allows the watsonx Orchestrate cluster to communicate with the watsonx.ai cluster. The watsonx.ai cluster hosts the LLMs that are required to generate answers to questions in the AI assistant. After you install watsonx Orchestrate, configure the external IFM.
Before you begin
- Ensure that you have created a space for the watsonx.ai deployment.
- Obtain the username and API key for the cluster that contains the watsonx.ai deployment. For more information, see Generating API keys for authentication.
- Import certificates from the watsonx.ai cluster to enable the IFM connection:
- Run the following command to get the certificate that you want to import:

      openssl s_client -connect <target-server>:443 -showcerts

  If you want to connect to a separate cluster running watsonx.ai, use the following commands:

      # The cpd route is on the remote CPD cluster that watsonx Assistant connects to
      export ROUTE=`oc get route -n cpd cpd -o jsonpath={'.spec.host'}`
      openssl s_client -connect ${ROUTE}:443 -showcerts

- In the resultant output, you'll find the certificates that you need:

      -----BEGIN CERTIFICATE-----
      <content of the certificate>==
      -----END CERTIFICATE-----
       1 s:CN = ingress-operator@1721342502
         i:CN = ingress-operator@1721342502
      -----BEGIN CERTIFICATE-----
      <content of the certificate>==
      -----END CERTIFICATE-----
- Remove the lines between END CERTIFICATE and BEGIN CERTIFICATE, and then save the certificates to a file (a command-line sketch of this cleanup follows this list). Ensure the following:
  - No whitespace remains where you removed the lines.
  - The file name ends with ".crt". For example, "wac-007.crt".
- Copy the certificate file to a system that has Red Hat OpenShift access to the watsonx Assistant cluster.
- Import the custom certificate by performing the steps provided in Creating a secret to store shared custom certificates (a secret-creation sketch also follows this list).
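The certificate cleanup described in the steps above can also be scripted. The following is a minimal sketch, not part of the documented procedure: it assumes the same cpd route and the example file name "wac-007.crt" from the steps above, and uses sed to keep only the lines from each BEGIN CERTIFICATE through its matching END CERTIFICATE, which drops the s:/i: subject lines between certificates.

      # Assumption: same remote CPD route and example file name (wac-007.crt) as in the steps above
      export ROUTE=`oc get route -n cpd cpd -o jsonpath={'.spec.host'}`
      openssl s_client -connect ${ROUTE}:443 -showcerts </dev/null 2>/dev/null \
        | sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' > wac-007.crt
      # Quick sanity check: prints the subject and expiry of the first certificate in the file
      openssl x509 -in wac-007.crt -noout -subject -enddate

Compare the resulting file with what the manual cleanup produces before you use it.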
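The import step in Creating a secret to store shared custom certificates ultimately stores the certificate file in a Red Hat OpenShift secret. The commands below only sketch that idea; the secret name and namespace are placeholders introduced here, not values defined by this document, so use the exact names that the linked procedure specifies for your deployment.

      # Placeholders: <secret-name> and <namespace> are not defined by this document;
      # take the real values from "Creating a secret to store shared custom certificates"
      oc create secret generic <secret-name> --from-file=wac-007.crt -n <namespace>
      oc get secret <secret-name> -n <namespace> -o yaml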