Installing Apache Kafka
Apache Kafka provides a buffer for messages that are sent to and received from external interfaces. Apache Kafka is not required if the IBM® Maximo® Manage software does not interface with external systems.
About this task
Apache Kafka is required by IoT and is optional for Manage.
Procedure
What to do next
- Configure Maximo Application Suite parameters
- Now you are ready to configure Apache Kafka details.
- In the Maximo Application Suite instance, log in to the Administration dashboard.
- In Configurations, select Apache Kafka. The following information is needed to configure the Apache Kafka details:
  - Hosts/Hostnames
  - Username/password
  - Certificates
Hosts - To obtain the bootstrap hosts, use the Red Hat OpenShift console (a command-line sketch follows these steps).
- In the Kafka project, go to the Routes page and search for the route kafka-kafka-tls-bootstrap.
- Copy the value in the Host field. For example, kafka-kafka-tls-bootstrap-kafka.<yourdomain.com>.
- The port number in the external route is 443. Enter the host name and port values in the hosts section.
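If you prefer the command line, the same host can usually be read with the oc client. A minimal sketch, assuming the AMQ Streams cluster runs in a project named kafka (adjust the project name for your environment):
# Print the external bootstrap host of the route (assumed project: kafka)
oc get route kafka-kafka-tls-bootstrap -n kafka -o jsonpath='{.spec.host}'
# The external route listens on port 443, so enter <host> and 443 in the hosts section.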
- To obtain the Kafka user's password, in the Red Hat OpenShift Kafka project, open the Secrets page and search for your Kafka user, for example, masuser. The data section contains the user's password.
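The password can also be read with the oc client. A sketch, assuming a Kafka user named masuser that uses SCRAM authentication in a project named kafka; values in the data section are base64-encoded, so decode them:
# Decode the password field of the Kafka user's secret (assumed names: masuser, project kafka)
oc get secret masuser -n kafka -o jsonpath='{.data.password}' | base64 -d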
- To obtain the certificates, while you configure the Apache Kafka parameters, click the Retrieve option to automatically retrieve the certificates from the Kafka bootstrap host. Alternatively, you can obtain the certificate details before you configure the Maximo Application Suite parameters and enter the information manually.
- To do this, in the Red Hat OpenShift console, switch to the Kafka project, and then navigate to Custom Resource Definitions. Search for Kafka.
- Click the Instances tab and select your instance in the Kafka namespace.
- Click YAML view. From this view, you can copy the certificate. The certificate has BEGIN CERTIFICATE and END CERTIFICATE tags, which must be included. For example:
  -----BEGIN CERTIFICATE-----
  MIIDLTCCAhWgAwIBAgIJANfi6SPho4cIM...
  -----END CERTIFICATE-----
- Copy the certificate text to be added in the Maximo Application Suite UI.
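As an alternative to copying the certificate from the YAML view, AMQ Streams (Strimzi) also stores the cluster CA certificate in a secret named <cluster-name>-cluster-ca-cert. A sketch, assuming the cluster and the project are both named kafka:
# Print the cluster CA certificate, including the BEGIN CERTIFICATE and END CERTIFICATE tags
oc get secret kafka-cluster-ca-cert -n kafka -o jsonpath='{.data.ca\.crt}' | base64 -d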
- Log in to the Maximo Application Suite Admin User Interface and go to Administration.
- Click Configurations.
- Click Apache Kafka.
- Add the TLS bootstrap host name and port. For example:
  Host: xxx.xxx.xxx.xx.com
  Port: 443
- Enter the username and password.
- Enter an alias name. For example, strimzi.
- Add the copied certificate(s), set the alias name, and then click Confirm.
- Click Save.
- To confirm that the configuration is successful, in the OpenShift console, navigate to Custom Resource Definitions and search for kafkacfg. Click the Instances tab, click your instance, and then view the YAML for any success or failure messages.
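The same check can be run from the command line. A sketch; the instance and namespace names are placeholders, and you can confirm the exact resource name with oc api-resources:
# Confirm the resource name of the Kafka configuration CRD
oc api-resources | grep -i kafkacfg
# View the status section of your instance for any success or failure messages
oc get kafkacfg <instance-name> -n <mas-core-namespace> -o yaml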
- IBM Event Streams
- Event Streams, which is available in the IBM Cloud® catalog, is an alternative to AMQ Streams for the Kafka dependency.
- To install an Event Streams instance in IBM Cloud, log in to your IBM Cloud account, go to Catalog, and search for "Event Streams". Click the Event Streams tile and go to the Create tab to open the provisioning details page, where you enter the information for your Event Streams instance.
- Location - It is recommended that you choose a location close to the server or cluster location of your Maximo Application Suite instance for improved network performance.
- Pricing Plan - Choose the plan that best fits your expected Kafka usage.
- Resource details - Enter a Service name (it can be any unique name), and then optionally enter more details such as the IBM Cloud resource group and tags.
- Review the summary of your Event Streams instance, review and accept the license agreement terms, and click Create.
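Provisioning can also be scripted with the IBM Cloud CLI. A sketch, assuming the standard plan in us-south and an example instance name of mas-event-streams; the Event Streams service is registered in the catalog under the service name messagehub:
ibmcloud login
ibmcloud target -g <resource-group>
# Create the Event Streams instance: <name> <service> <plan> <location>
ibmcloud resource service-instance-create mas-event-streams messagehub standard us-south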
When you create a service credential for the Event Streams instance, specify the following values:
- Name: A unique name for your service credential. For example, Service credentials-1.
- Role: Defines the level of permissions for your Event Streams instance. For example, Manager (default).
The service credential contains the following connection details:
- kafka_brokers_sasl - Contains the six hostnames of the available Kafka brokers for your Event Streams instance.
- user - The Kafka username. The default is token.
- password - The Kafka password.
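A matching CLI sketch for creating and reading the service credential; the key name and instance name are examples:
# Create a service credential with the Manager role for the example instance
ibmcloud resource service-key-create mas-eventstreams-creds Manager --instance-name mas-event-streams
# Print the credential; kafka_brokers_sasl, user, and password are the values used in the next steps
ibmcloud resource service-key mas-eventstreams-creds --output json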
- Suite configuration parameters for Event Streams
- Now you are ready to configure Event Streams in Maximo Application Suite.
- Log in to the Suite Administration dashboard of your Maximo Application Suite instance and go to Configurations.
- Select Apache Kafka.
- Enter the following information to configure Event Streams as the Kafka service for Maximo Application Suite:
- Hosts/Hostnames - Add a row for each of the six Kafka broker hostnames provided in the Event Streams service credential. Note: Copy only the Kafka broker hostname; do not copy the port. For example:
  broker-0-<your-event-streams-broker-id>.kafka.svc07.us-south.eventstreams.cloud.ibm.com
  broker-1-...
  ...
  broker-5-<your-event-streams-broker-id>.kafka.svc07.us-south.eventstreams.cloud.ibm.com
- Port - Enter the port associated with the Kafka broker hostnames provided in the Event Streams service credential. For example, 9093.
- SASL Mechanism - Select plain. This is the default authentication mechanism for Event Streams.
- Username - Enter the user value provided in the Event Streams service credential.
- Password - Enter the password provided in the Event Streams service credential.
- Certificates - Enter the chain of SSL certificates for your Event Streams instance.
- Click Add to add the intermediate certificate of the chain.
- Enter an alias. For example, kafkacertpart1.
- Enter the certificate content. Include the Let's Encrypt R3 intermediate certificate, issued to US, Let's Encrypt, R3. For example:
  -----BEGIN CERTIFICATE-----
  MIIF5jCCBM6gAwIBAgISA0Y...
  -----END CERTIFICATE-----
- Click Confirm. The first part of the certificate chain should have valid dates and look like the following example:
  Issued to: US, Let's Encrypt, R3
  Issued by: US, Internet Security Research Group, ISRG Root X1
  Valid from: Thu Sep 01 2022
  Valid to: Mon Sep 15 2025
  This is the intermediate certificate, which is required for the SSL connection to the Event Streams endpoint.
- Click Add to add the root certificate of the chain.
- Enter an alias. For example, kafkacertpart2.
- Enter the certificate content. Include the ISRG Root X1 cross-signed certificate, issued to US, Internet Security Research Group, ISRG Root X1. For example:
  -----BEGIN CERTIFICATE-----
  MIIFazCCA1OgAw...
  -----END CERTIFICATE-----
- Click Confirm. The second part of the certificate chain should have valid dates and look like the following example:
  Issued to: US, Internet Security Research Group, ISRG Root X1
  Issued by: US, Internet Security Research Group, ISRG Root X1
  Valid from: Thu Jun 04 2015
  Valid to: Mon Jun 04 2035
  This is the root certificate, which is required for the SSL connection to the Event Streams endpoint.
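If you want to inspect the certificate chain that the broker endpoint actually presents, an openssl sketch; the broker host is the example value from the service credential, and the server typically presents the intermediate certificate, while the root might need to be obtained separately:
# Show the certificates presented by the Event Streams broker endpoint
openssl s_client -connect broker-0-<your-event-streams-broker-id>.kafka.svc07.us-south.eventstreams.cloud.ibm.com:9093 -showcerts </dev/null
# Each certificate in the output is delimited by BEGIN CERTIFICATE and END CERTIFICATE tags.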
- Save the Apache Kafka configuration.
Now, wait for the Apache Kafka configuration to reconcile; this process might take up to 10 minutes. The configuration is complete when the configuration status is set to Ready.
Configuration Ready - Kafka configuration was successfully verified