Consuming messages from Kafka topics
You can use the KafkaConsumer node to receive messages that are published on a Kafka topic.
Before you begin
About this task
You can use a KafkaConsumer node in a message flow to subscribe to a specified topic on a Kafka server. The KafkaConsumer node then receives messages that are published on the Kafka topic, as input to the message flow.
You can use a KafkaProducer node to publish messages from your message flow to a topic that is hosted on a Kafka server, and you can use a KafkaRead node to read an individual message on a Kafka topic. For more information about using these nodes, see Producing messages on Kafka topics and Reading an individual message from a Kafka topic.
Each KafkaConsumer node consumes messages from a single topic; however, if the topic is defined to have multiple partitions, the KafkaConsumer node can receive messages from any of the partitions. For more information about partitions in Kafka topics, see the Apache Kafka documentation.
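The key property of a partitioned topic is that messages with the same key always land in the same partition, so per-key ordering is preserved even though the consumer may receive messages from any partition. The following Python sketch is purely illustrative (it is not the node API, and Kafka's default partitioner actually uses a murmur2 hash rather than Python's built-in hash):

```python
# Illustrative sketch: how Kafka maps a message key to a partition.
# Kafka's default partitioner uses a murmur2 hash of the key; Python's
# built-in hash() stands in for it here purely for illustration.
NUM_PARTITIONS = 3  # hypothetical partition count for the topic

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Messages with the same key always map to the same partition."""
    return hash(key) % num_partitions

# Two messages with the same key go to the same partition, so their
# relative order is preserved within that partition.
first = partition_for("order-42")
second = partition_for("order-42")
assert first == second
```

Messages with different keys may be spread across partitions, which is what allows a single KafkaConsumer node to receive from any of them.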
The KafkaConsumer node reads messages from Kafka non-transactionally. As a result, if an error occurs or the message is rolled back to the input node, and no Catch terminal is connected, the message is not reprocessed by the input node.
To process received messages concurrently, you can configure additional instances on the KafkaConsumer node. When additional instances are configured, a single Kafka message consumer is created, and the messages are distributed across the additional flow instances. Because messages are processed concurrently, message ordering is not preserved when additional instances are used. For more information about specifying additional instances, see KafkaConsumer node.
You can also increase concurrency by deploying multiple KafkaConsumer nodes that share the same Group ID; Kafka ensures that messages that are published on the topic are shared across the consumer group. For more information about how Kafka shares the message across multiple consumers in a consumer group, see the Apache Kafka documentation.
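How Kafka shares a topic across a consumer group can be pictured as assigning each partition to exactly one group member. The Python sketch below is a simplified, hypothetical model of a round-robin assignment (Kafka's actual assignors, described in the Apache Kafka documentation, are more sophisticated and rebalance dynamically):

```python
# Hypothetical sketch of consumer-group partition assignment, in the
# spirit of Kafka's round-robin assignor. Not the product API.
def assign_partitions(partitions, consumers):
    """Return {consumer: [partitions]}; each partition is owned by
    exactly one consumer, so each message is processed once per group."""
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# Three KafkaConsumer nodes sharing one Group ID, six-partition topic:
result = assign_partitions(list(range(6)), ["nodeA", "nodeB", "nodeC"])
# Each node owns a disjoint subset of partitions, e.g. nodeA -> [0, 3].
```

This is why deploying more KafkaConsumer nodes with the same Group ID than the topic has partitions gains nothing: the surplus consumers receive no partitions.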
You can use Kafka custom header properties to add metadata to Kafka messages for use during message processing. These properties are set in the LocalEnvironment, in a folder called KafkaHeader. For more information, see Setting and retrieving Kafka custom header properties.
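Conceptually, Kafka custom headers are key/value pairs carried alongside the message payload, which lets a flow attach routing or correlation metadata without modifying the message body. The sketch below models this in plain Python (the class and function names are hypothetical; in the product, the headers live in the LocalEnvironment's KafkaHeader folder, not in a Python object):

```python
# Illustrative model of a Kafka message with custom headers.
# KafkaMessage and tag_for_routing are hypothetical names for this sketch;
# the headers dict stands in for the LocalEnvironment KafkaHeader folder.
from dataclasses import dataclass, field

@dataclass
class KafkaMessage:
    value: bytes                                   # the message payload
    headers: dict = field(default_factory=dict)    # custom header properties

def tag_for_routing(msg: KafkaMessage, region: str) -> KafkaMessage:
    """Attach routing metadata as a header, leaving the payload untouched."""
    msg.headers["region"] = region
    return msg

msg = tag_for_routing(KafkaMessage(b'{"orderId": 42}'), "EMEA")
# Downstream nodes can route on msg.headers["region"] without parsing
# the payload itself.
```

The useful property this illustrates is separation of concerns: intermediaries can read or set headers for routing decisions while treating the payload as opaque bytes.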
Procedure
Complete the following steps to receive messages that are published on a Kafka topic: