IBM Support

Integration for Maximo Public SaaS

Product Documentation


Abstract

Maximo Public SaaS Allows Outbound Events, Inbound Integration, Data Export

Content

Maximo Public SaaS provides integration support primarily through REST APIs. The REST API documentation can be found at https://developer.ibm.com/static/site-id/155/maximodev/restguide/Maximo_Nextgen_REST_API.html.

Below is a list of features that are not currently enabled in Public SaaS:

  1. Creation or modification of Object Structures is not allowed. 
  2. The data transformation (exits) and endpoint routing features are not available. 
  3. No support for web services or XML-based processing; only JSON and CSV are supported.
  4. No support for integration scripting.

Outbound Events

Often, we need to push messages out to an external system when application objects (Mbos) are created, updated, or deleted. You may also want to specify the conditions under which those broad events (Create/Update/Delete) trigger outbound messages. Maximo Public SaaS allows one to define these events and event condition filters. To do this, go to the “Administration” app -> “Integration” module -> “Outbound Events” tab.

If you have not defined one yet, it will ask you to define a Message Hub Provider. The Kafka provider hosts message topics into which the event messages flow, effectively providing a store for the event messages. Maximo Public SaaS does not provide a consumer for those topics; customers are expected to provide the tooling/logic to consume messages from the topics and push them to their destinations.

The Integration app launches a message hub provider configuration dialog every time one opens it until the provider setup has been completed. A Message Hub Provider is a Kafka service instance (IBM Event Streams) in IBM Cloud, which you need to provision from your IBM Cloud dashboard. Once it is provisioned, go to the “service credentials” section of the service (in your IBM Cloud dashboard), copy the Kafka broker list (kafka_brokers_sasl), and paste it into the “Servers” text area of the provider dialog. Also copy the user and password from that service credentials section into the corresponding fields of the dialog. On dialog save, the system verifies connectivity to the Kafka provider.
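The values copied into the provider dialog are the same ones any external Kafka client will later need to consume the topics. As a rough sketch (the broker host name and the kafka-python-style option names below are assumptions for illustration, not Maximo configuration), the service-credentials JSON maps onto a consumer configuration like this:

```python
import json

# Hypothetical service-credentials JSON as copied from the IBM Cloud
# "service credentials" section of the Event Streams instance.
creds = json.loads("""
{
  "kafka_brokers_sasl": ["broker-0.example.eventstreams.cloud.ibm.com:9093"],
  "user": "token",
  "password": "<apikey-from-service-credentials>"
}
""")

# Map the credentials onto typical Kafka client settings (kafka-python
# naming assumed). IBM Event Streams uses SASL over TLS with the PLAIN
# mechanism, hence the security_protocol / sasl_mechanism values.
consumer_config = {
    "bootstrap_servers": creds["kafka_brokers_sasl"],
    "security_protocol": "SASL_SSL",
    "sasl_mechanism": "PLAIN",
    "sasl_plain_username": creds["user"],
    "sasl_plain_password": creds["password"],
}
print(consumer_config["security_protocol"])
```

A real consumer would pass this dictionary to its Kafka client constructor; only the credential mapping is shown here.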

Once the setup is done, we can start defining events. Events are always defined for an Object Structure, which is a pre-defined collection of related business objects (Mbos). While defining an event, we need to select a topic into which the event messages will flow. The app provides a lookup of the available topics based on the configured Kafka provider service.

The message published will be a JSON message, the structure of which can optionally be defined using a message template. Message templates provide a view on top of the Object Structure. Message templates can be defined from the 2nd tab in the Integration app. You can associate any existing template (for the event Object structure) to an event. All messages generated for that event would conform to that template.

As discussed earlier, we can further qualify the event using event filter conditions, which can be combined with AND or OR.
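Since Maximo only publishes the messages, consuming them is up to the customer. A minimal sketch of the consumer side, assuming a hypothetical asset event payload (the real shape is determined by the Object Structure and any associated message template):

```python
import json

# Hypothetical outbound event message for an asset Object Structure.
# The actual fields depend on the Object Structure and message template.
raw_message = """
{
  "assetnum": "ASSET101",
  "siteid": "MYSITE",
  "status": "OPERATING"
}
"""

def handle_event(payload: str) -> str:
    """Parse one event message pulled from the Kafka topic.

    Maximo publishes the messages; routing them onward (this function)
    is the customer's responsibility.
    """
    event = json.loads(payload)
    # A real handler would forward to the destination system here;
    # we only build a summary line for illustration.
    return f"asset {event['assetnum']} at site {event['siteid']}"

print(handle_event(raw_message))
```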

Inbound Integration

In Maximo Public SaaS we use the REST APIs for inbound integration. Since the whole deployment is protected using the OIDC (OpenID Connect) protocol, we recommend the API key approach for headless integrations (integrations that do not involve a user and a browser).

API keys can be generated from the Admin work center -> Integration -> API Keys tab. We can generate API keys for any user that belongs to the integration group, a security group specifically designated for users with the integration role.

With API keys, the REST APIs need to be invoked at the /maximo/api context. For example, to create an asset using the MXAPIASSET object structure:

POST /maximo/api/os/mxapiasset?apikey=<the apikey value>&lean=1

properties: *

{
     "assetnum": "ASSET101",
     "siteid": "MYSITE",
     "description": "test asset"
}

This responds with the newly created asset, and the Location header contains the URL of the newly created asset.

The important thing to note here is the use of the API key and the /api context (as opposed to /oslc). It is the same API stack – just a different context and a different form of authentication to support headless clients.

Also note that when using API keys there is no persistent session, so the API key must be sent with every request.
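Putting the pieces together, the asset-creation call above can be sketched in Python using only the standard library. The host name and API key value are placeholders; `urllib.request.urlopen(req)` would actually send the request, so the sketch stops at constructing it:

```python
import json
import urllib.parse
import urllib.request

BASE = "https://acme.example.com/maximo/api"  # hypothetical host
APIKEY = "<the apikey value>"                 # generated in the Admin work center

# Query string: the API key plus lean=1, as in the example above.
query = urllib.parse.urlencode({"apikey": APIKEY, "lean": "1"})

# JSON body for the new asset.
body = json.dumps({
    "assetnum": "ASSET101",
    "siteid": "MYSITE",
    "description": "test asset",
}).encode("utf-8")

req = urllib.request.Request(
    url=f"{BASE}/os/mxapiasset?{query}",
    data=body,
    method="POST",
    headers={
        "Content-Type": "application/json",
        # Ask for all properties of the created asset in the response.
        "properties": "*",
    },
)
# urllib.request.urlopen(req) would send it; we only inspect the request.
print(req.get_method(), req.full_url)
```

Since there is no session, the same `apikey` query parameter has to be attached to every subsequent request.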

Data Export

Data export can be done using the REST APIs. A sample API call for exporting data as CSV is shown below:

GET /api/os/mxapiasset?oslc.select=assetnum,siteid,status,location&oslc.pageSize=100&oslc.where=status="OPERATING"&_format=csv&apikey=<apikey>

If we do not include _format=csv, the default format is JSON.

Note that you can always log in to the system using the browser and then use another browser tab to make these requests, without needing an API key. The API key is needed only when you are operating in a browserless interaction mode.
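The where clause in the export example contains quotes and an equals sign, which most HTTP clients require to be percent-encoded. A small sketch of building the same export URL with proper encoding (host omitted; parameter values taken from the example above):

```python
import urllib.parse

# Export parameters from the sample GET call; urlencode percent-encodes
# the quotes and the equals sign inside the oslc.where value.
params = {
    "oslc.select": "assetnum,siteid,status,location",
    "oslc.pageSize": "100",
    "oslc.where": 'status="OPERATING"',
    "_format": "csv",
    "apikey": "<apikey>",
}
query = urllib.parse.urlencode(params)
url = "/maximo/api/os/mxapiasset?" + query
print(url)
```

Dropping the `_format` parameter from the dictionary would switch the export back to the default JSON format.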

Data Import

Data import can be done using JSON or CSV formats, either synchronously or asynchronously. Below is an example of a CSV import:

POST /maximo/api/os/mxapiasset?action=importfile

<csv file content for mxapiasset>

This should import the files synchronously. The response should look like

{
     "validdocs": 10
}

This assumes there were 10 records in the CSV file.

Note that this requires the object structure to be flat-file enabled. IBM intends to make most of the common master-data object structures – such as MXAPIITEM, MXAPIASSET, and MXAPIOPERLOC – flat-file enabled.

Another way to load the data is to avoid the CSV and flat-file enablement altogether and use JSON instead. In the above example you can send the same data using a JSON array:

POST /maximo/api/os/mxapiasset?action=importfile

Filetype: JSON

[
    {
         "assetnum": "T1",
         "siteid": "S1"
    },
    {
         "assetnum": "T2",
         "siteid": "S1"
    }
]
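The two import variants carry the same records in different envelopes. A sketch of building both payloads from one record list (the simple header-plus-rows CSV layout is an assumption for illustration; check the flat-file format your object structure actually expects):

```python
import csv
import io
import json

# The same two assets shown in the JSON array example.
records = [
    {"assetnum": "T1", "siteid": "S1"},
    {"assetnum": "T2", "siteid": "S1"},
]

# JSON payload: sent with action=importfile and the Filetype: JSON header.
json_payload = json.dumps(records)

# CSV payload: same data for the flat-file path (the object structure
# must be flat-file enabled for this variant). Header row layout assumed.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["assetnum", "siteid"],
                        lineterminator="\n")
writer.writeheader()
writer.writerows(records)
csv_payload = buf.getvalue()
print(csv_payload)
```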

To attempt an asynchronous load of the same data, we just need to add the file name and the async flag as query parameters. The example below loads a CSV file asynchronously.

POST /maximo/api/os/mxapiasset?action=importfile&name=testload1.csv&async=1

<csv file content for mxapiasset>

In this case, you need to provide a unique value for the name query parameter on every load, effectively loading a new, uniquely named file each time. The data is syntax-validated before being stored in an internal repository. A cron task (APIFILEIMPORT) that runs periodically (every 30 minutes by default; the customer can set the frequency) picks up the uploaded files and processes their content. The response to the upload includes a location URL that can provide the status of the load as it progresses. If there is an error, the response from the “location” URL contains an embedded error file URL, which can be used to download the error file. The customer may fix the error CSV file and upload the content again – under a new name. This feature is also available for the JSON upload; the JSON upload, however, needs the request header to set the filetype to JSON.

A sample status JSON with an error is shown below. Note that it reports the errcount (1) and the errorfile URL that lets the user download the error file to fix and retry (with a different name query parameter).

{

       "iscomplete": true,

       "totalcount": 3,

       "errcount": 1,

       "requser": "WILSON",

       "fileimportreqqueueid": 3,

       "format": "JSON",

       "errorfile": "http://host:port/maximo/api/fileimporterrfile/3",

       "_rowstamp": "1521573",

       "iscancelled": false,

       "reqdatetime": "2019-02-20T14:08:22-05:00",

       "name": "testloc3.json",

       "href": "http://host:port/maximo/api/os/mxapifileimportqueue/_dGVzdGxvYzMuanNvbg--",

       "pindex": 3,

       "osname": "MXAPIOPERLOC"

}
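A consumer of this status document mainly needs iscomplete, errcount, and errorfile. A minimal sketch of interpreting it (fields trimmed to those used; the decision logic is an illustration, not a Maximo API):

```python
import json

# Status document returned by the "location" URL, trimmed to the
# fields this sketch inspects (full sample shown above).
status = json.loads("""
{
  "iscomplete": true,
  "totalcount": 3,
  "errcount": 1,
  "errorfile": "http://host:port/maximo/api/fileimporterrfile/3",
  "name": "testloc3.json"
}
""")

def summarize(doc: dict) -> str:
    """Decide what to do once the async load finishes."""
    if not doc.get("iscomplete"):
        return "still running"
    if doc.get("errcount", 0) > 0:
        # Download doc["errorfile"], fix the failed rows, and re-upload
        # under a new name query parameter.
        return f"{doc['errcount']} of {doc['totalcount']} records failed"
    return "all records loaded"

print(summarize(status))
```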

Document Location

Worldwide


Document Information

Modified date:
10 June 2019

UID

ibm10887279