Integration by using Apache Kafka
You can integrate with the Apache Kafka messaging platform to publish and consume messages asynchronously. Kafka can be configured as an external message provider that accepts and stores outbound and inbound messages.
Introduction to Kafka integration
The product provides a Kafka messaging client that supports consuming, producing, and browsing Kafka topics. To use Kafka features in the product, you must obtain and configure a Kafka server instance from a provider. You can choose any Kafka provider that is compatible with the Kafka 2.8.x client libraries.
The Kafka provider manages the topics and the corresponding partitions and storage that persist the integration messages. The integration framework provides the Kafka client framework to read, write, and browse messages in Kafka topics.
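For illustration only, the following sketch shows how a client that is built on the Kafka 2.8.x client libraries can produce a message to a topic. The broker address, topic name, and payload are placeholder assumptions for the example, not Maximo Manage defaults.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProduceSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed broker address; use the address of your Kafka provider.
        props.put("bootstrap.servers", "kafka.example.com:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        // Byte-array values; the integration messages are byte messages.
        props.put("value.serializer", ByteArraySerializer.class.getName());

        try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
            byte[] payload = "<example outbound message>".getBytes();
            // "mxintout" is a hypothetical outbound topic name.
            producer.send(new ProducerRecord<>("mxintout", payload));
            producer.flush();
        }
    }
}
```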
The integration framework uses point-to-point delivery of Kafka messages, so each Kafka topic is used by the integration framework as a messaging queue. Only one consumer group per topic is used to consume messages for processing, and one additional consumer group per topic is available for browsing the topic.
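As a sketch of the point-to-point pattern, the following consumer uses a single, fixed consumer group so that each message in the topic is delivered to only one consumer in that group, which gives queue-like behavior. The broker address, group ID, and topic name are placeholder assumptions.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ConsumeSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.com:9092"); // assumed broker
        props.put("group.id", "mxintout-processing");             // single processing group per topic (placeholder name)
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", ByteArrayDeserializer.class.getName());

        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("mxintout"));
            // Each record is delivered to only one consumer in this group,
            // so the topic behaves like a message queue for the group.
            ConsumerRecords<String, byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, byte[]> record : records) {
                System.out.println("offset=" + record.offset() + " size=" + record.value().length);
            }
        }
    }
}
```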
The integration framework uses only messaging features of Kafka and does not use any of the Kafka streaming features.
The integration framework supports asynchronous message processing with Kafka providers, similar to the way that it uses JMS providers. Kafka topics persist messages to a data store for outbound publish or notification channel messages and inbound enterprise service messages. In both cases, sequential and continuous message processing is supported.
Maximo Manage components for Kafka message queues
- Maximo Manage cron tasks for consuming messages from the queues.
- The integration framework for file loading, and the interface table cron task and servlet, which write, or produce, messages to the queues.
- External Systems application configuration to support registration of the Kafka provider and queue processing.
- The Maximo Manage REST API for Kafka, which enables browsing messages in the queue. Kafka messages cannot be deleted from Maximo Manage. Instead, use the Message Reprocessing application to handle a message, which in effect skips the message so that processing continues with the next one. A sketch of browsing with a separate consumer group follows this list.
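The Maximo Manage REST API is the supported way to browse messages. Purely to illustrate how a separate browsing consumer group can read a topic without affecting the processing group, the following hedged sketch uses a distinct group ID, disables offset commits, and rewinds to the start of the assigned partitions. The group and topic names are assumptions.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class BrowseSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.com:9092"); // assumed broker
        props.put("group.id", "mxintout-browse");                 // separate browsing group (placeholder name)
        props.put("enable.auto.commit", "false");                 // browsing does not advance committed offsets
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", ByteArrayDeserializer.class.getName());

        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("mxintout"));
            consumer.poll(Duration.ofSeconds(1));                 // join the group and receive partition assignments
            consumer.seekToBeginning(consumer.assignment());      // rewind so the browse starts at the oldest retained message
            ConsumerRecords<String, byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, byte[]> record : records) {
                System.out.println("offset=" + record.offset() + " bytes=" + record.value().length);
            }
        }
    }
}
```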
Differences between JMS queue processing and Kafka queue processing
If you are considering changing from JMS queues to Kafka queues for message processing, the processing features in Kafka are similar, but be aware of some differences. Many of the following differences are discussed in more detail in other topics in the Kafka content.
- In Maximo Manage, a Kafka message type is always byte and not text. JMS messages can be text or byte.
- A Kafka message is always compressed.
- The Kafka message size limit is configured in the mxe.kafka.messagesize system property in Maximo Manage. Set its value, in bytes, to match the maximum message size that is configured on the Kafka server. A topic configuration sketch follows this list.
- Kafka messages are not deleted when they are processed. They are removed from the queue after the configured retention time is exceeded.
- Continuous message consumption is managed by Maximo Manage cron tasks and not by message-driven beans (MDBs).
- The Maximo Manage REST API is used for browsing messages in a queue.
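The exact server-side settings depend on your Kafka provider. As a hedged sketch only, the following Java AdminClient call creates a topic whose configuration reflects the points above: compressed storage, a maximum message size, and time-based retention instead of delete-on-consume. The topic name, partition count, replication factor, and values are placeholder assumptions; align them with your own Kafka server and Maximo Manage settings, such as mxe.kafka.messagesize.

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class TopicConfigSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.com:9092");  // assumed broker

        Map<String, String> configs = new HashMap<>();
        configs.put("compression.type", "producer");   // store messages compressed, as sent by the producer
        configs.put("max.message.bytes", "10485760");  // example 10 MB limit; align with mxe.kafka.messagesize
        configs.put("retention.ms", "604800000");      // example 7-day retention; messages are removed by age, not by consumption

        try (AdminClient admin = AdminClient.create(props)) {
            // "mxintout", 3 partitions, and replication factor 1 are placeholder values.
            NewTopic topic = new NewTopic("mxintout", 3, (short) 1).configs(configs);
            admin.createTopics(Collections.singleton(topic)).all().get();
        }
    }
}
```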