Maximo AI Service
Maximo AI Service is an integrated add-on for Maximo Application Suite that enables select AI features. In Maximo Manage, Maximo AI Service enables an AI assistant, field value recommendations (including problem code recommendations for work orders), locating similar work orders, and AI recommendations in Reliability Strategies.
The AI broker, which was introduced in 9.0, is replaced with Maximo AI Service as of 1 August 2025. To continue using the features that were enabled by the AI broker after that time, you must uninstall any instance of the broker and then deploy and use Maximo AI Service 9.1. You can deploy Maximo AI Service 9.1 with Maximo Application Suite 9.0 or 9.1. If Maximo AI Service is deployed with Maximo Application Suite 9.0, you can use only the AI features that were included in Maximo Application Suite 9.0. For more information about uninstalling the AI broker, see Uninstalling the AI broker.
Overview
Maximo AI Service provides the following capabilities:
- Managing the configuration, training, and retraining of AI models, and retaining data during training.
- Delegating inferencing jobs to watsonx AI or to a local embedded runtime.
- Completing health checks of the AI model runtime and the individual models.
Maximo AI Service supports the following deployment models:
- Full SaaS, which means that Maximo AI Service, Maximo Application Suite, and watsonx.ai are all SaaS.
- Full on-premises, which means that Maximo AI Service, Maximo Application Suite, and watsonx.ai are deployed on-premises. In this case, watsonx.ai might run in its own on-premises cluster.
- Hybrid, which means that Maximo Application Suite is deployed on-premises but Maximo AI Service and watsonx.ai are both SaaS.
For watsonx.ai to use AppPoints, Maximo AI Service and watsonx.ai must use the same deployment model. If Maximo AI Service is deployed on‑premises, but watsonx.ai is SaaS, watsonx.ai cannot use AppPoints; Maximo AI Service can use AppPoints, but you must procure and pay for watsonx.ai separately.
You cannot track AppPoint usage for Maximo AI Service in Maximo Application Suite licensing dashboards.
Maximo AI Service supports multitenancy. Model inferencing and training support only the English language. Data that is not in English cannot be processed as part of inferencing or used to generate output.
To enable Maximo AI Service in production, development, and testing environments, you must enable Maximo AI Service individually in each environment type. Maximo AI Service requires a unique tenant ID per environment.
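Because each environment type needs its own tenant ID, it can help to check a planned configuration before enabling the service. The following Python sketch is illustrative only: the environment names and tenant IDs are placeholders, not values from any real deployment.

```python
# Illustrative sketch: environment names and tenant IDs are placeholders.
# Each environment type (production, development, test) must use its own
# Maximo AI Service tenant ID, so reject any configuration that reuses one.
def validate_tenant_ids(env_tenants: dict[str, str]) -> None:
    seen: dict[str, str] = {}
    for env, tenant in env_tenants.items():
        if tenant in seen:
            raise ValueError(
                f"Tenant ID {tenant!r} is shared by {seen[tenant]!r} and {env!r}; "
                "each environment requires a unique tenant ID."
            )
        seen[tenant] = env

# Accepted: three environments, three distinct tenant IDs.
validate_tenant_ids({"production": "t-prod", "development": "t-dev", "test": "t-test"})
```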
Maximo AI Service also enables some AI features in Maximo IT. For more information, see Integrating AI with Maximo IT.
Deploying Maximo AI Service on-premises
You can deploy Maximo AI Service on-premises with Maximo Application Suite 9.0 or 9.1. For example, you can deploy Maximo AI Service 9.1 on-premises with Maximo Application Suite 9.0. Maximo AI Service on-premises cannot be used with watsonx.ai SaaS.
To deploy Maximo AI Service 9.1 on-premises and then enable the AI features, complete the following steps:
- If you installed the AI broker, uninstall the broker and MariaDB. For more information, see Uninstalling the AI broker.
- If you started a deployment for Maximo AI Service 9.1 and MariaDB is part of that deployment, uninstall that version of MariaDB. Earlier versions of Maximo AI Service required MariaDB.
- Open the Red Hat® OpenShift® web console.
- From the side navigation, click .
- Search for the mariadb project name.
- For the project, click the three-dot menu and then click Delete Project.
- Deploy Maximo AI Service.
To deploy Maximo AI Service, you must first set up and configure the prerequisite software, including watsonx.ai. You can then complete the deployment by using a CLI or Ansible® collection, connect Maximo AI Service to Maximo Manage, and then verify that Maximo AI Service is running and connected. For more information, see Deploying Maximo AI Service on-premises.
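The final verification step, confirming that the AI model runtime and the individual models are healthy, can be sketched in Python. This is a hypothetical example: the JSON field names (`runtime`, `models`, `status`) are assumptions for illustration, not the documented Maximo AI Service health-check response format.

```python
import json

def summarize_health(payload: str) -> str:
    """Summarize a health-check payload for the AI model runtime and models.

    The field names used here are assumptions for illustration only, not the
    documented Maximo AI Service response format.
    """
    data = json.loads(payload)
    # The runtime itself must be ready before individual models matter.
    if data.get("runtime") != "ready":
        return "runtime not ready"
    # Collect any models that are not yet ready to serve inferencing.
    not_ready = [m["name"] for m in data.get("models", []) if m.get("status") != "ready"]
    if not_ready:
        return "models not ready: " + ", ".join(not_ready)
    return "healthy"

sample = '{"runtime": "ready", "models": [{"name": "pcc", "status": "ready"}]}'
print(summarize_health(sample))  # healthy
```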
- Create AI configurations for the AI features that you want to enable.
You create AI configurations in Maximo Manage in the AI configuration application. Each AI feature that you want to enable requires its own configuration. For more information, see AI features.
Deploying Maximo AI Service SaaS
To deploy Maximo AI Service SaaS, contact your IBM representative. You can deploy Maximo AI Service SaaS with Maximo Application Suite on-premises or SaaS. Maximo AI Service SaaS cannot be used with watsonx.ai on-premises.
To connect Maximo AI Service SaaS to Maximo Manage, complete the following steps:
- In Maximo Manage, open the System Properties application.
- Search for, select, and then add global values for the following properties:
- mxe.int.aibrokerapikey. The value is the Maximo AI Service API key.
- mxe.int.aibrokerapiurl. The value is the Maximo AI Service URL.
- mxe.int.aibrokertenantid. The value is the Maximo AI Service tenant ID.
- After you edit each property, in the Common Actions menu, click Save Property.
- After you edit all properties, in the Common Actions menu, click Live Refresh.
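The three property values are the connection details that Maximo Manage uses to reach the service, so it is worth validating them before saving. The following sketch is a hypothetical illustration: the header names (`apikey`, `tenantid`) are assumptions, not the documented Maximo AI Service request format.

```python
from urllib.parse import urlparse

def build_connection(api_url: str, api_key: str, tenant_id: str) -> dict:
    """Validate property values and assemble hypothetical request metadata.

    The header names ("apikey", "tenantid") are assumptions for illustration,
    not the documented Maximo AI Service wire format.
    """
    parsed = urlparse(api_url)
    # mxe.int.aibrokerapiurl should be a well-formed https URL.
    if parsed.scheme != "https" or not parsed.netloc:
        raise ValueError(f"mxe.int.aibrokerapiurl must be an https URL: {api_url!r}")
    # mxe.int.aibrokerapikey and mxe.int.aibrokertenantid must not be empty.
    if not api_key or not tenant_id:
        raise ValueError("API key and tenant ID must not be empty")
    return {
        "url": api_url.rstrip("/"),
        "headers": {"apikey": api_key, "tenantid": tenant_id},
    }

conn = build_connection("https://aiservice.example.com/api/", "my-key", "tenant-01")
print(conn["url"])  # https://aiservice.example.com/api
```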
Deploying Maximo AI Service in the feature channel
You can deploy Maximo AI Service in the on-premises Maximo Application Suite feature channel. The deployment process for Maximo AI Service in the feature channel is the same as the deployment process for Maximo AI Service on-premises. For more information, see Deploying Maximo AI Service on-premises.
For more information about the feature channel, see What's new in the Maximo Application Suite feature channel.
AI features
The following table lists the available AI features, the associated model template name, the required models, the product in which each feature is used, and links to documentation that describes how to set up the AI configuration.
If you are already using Granite™ 3.2 8B Instruct for mcc, pcc, fmea, and nl2oslc templates, you must move to the gpt-oss-120b model. For more information, see Changing to gpt-oss-120b models.
| Feature | Model template | Models | Used in | Instructions |
|---|---|---|---|---|
| Problem code recommendations for work orders | pcc | | Maximo Manage | Enabling recommended problem codes for Work orders. Note: Model training can use significant resources. Ensure that your Red Hat OpenShift cluster can handle the load. Inferencing occurs locally in the cluster and consumes fewer resources than training. |
| Field value recommendations | mcc | | Maximo Manage | Enabling field value recommendations. Note: Model training can use significant resources. Ensure that your Red Hat OpenShift cluster can handle the load. Inferencing occurs locally in the cluster and consumes fewer resources than training. |
| AI assistant | nl2oslc | | Maximo Manage and Maximo Health | Enabling the assistant |
| Locating similar work orders | similarity | | Maximo Manage | Enabling locating of similar work orders |
| AI recommendations for asset boundary and the failure list in Reliability Strategies | fmea | | Maximo Manage | Enabling AI recommendations in Reliability Strategies |
| AI insights into asset condition. Note: This feature is available in the feature channel. In Maximo Application Suite, customer-managed users can use the feature channel to update their nonproduction instances to preview this feature. In Maximo Application Suite as a Service, you can use this feature in your environment. For more information, see What's new in the Maximo Application Suite feature channel. | insightsgenerator | | Maximo Manage | Enabling AI insights for assets |