How to use IBM App Connect with Kafka

Kafka is a real-time event streaming platform that you can use to publish and subscribe, store, and process events as they happen. IBM® App Connect provides a Kafka connector that you can use to connect to various supported Kafka implementations.

You can use App Connect to connect to a Kafka broker and configure an integration flow that gets triggered whenever a message is received on the configured topic. You can also create an integration flow that sends messages to a configured topic.

Availability:
  • A connector in IBM App Connect Enterprise as a Service
  • A local connector in a Designer instance of IBM App Connect in containers (Continuous Delivery release) 11.0.0.11-r1 or later
  • A local connector in a Designer instance of IBM App Connect in containers (Long Term Support release)
  • A local connector in a Designer instance of IBM App Connect in containers (Long Term Support Cycle-2 release)

The following information describes how to use App Connect to connect to Kafka.

Supported product and API versions

To find out which product and API versions this connector supports, see Detailed System Requirements on the IBM Support page.

What to consider first

Before you use App Connect Designer with Kafka, take note of the following considerations:

App Connect supports connection to various Kafka implementations. For the supported implementations and versions, see the Detailed System Requirements on the IBM Support page.

Connecting to Kafka

Kafka supports various security measures to connect to a cluster. The default setting is nonsecured, but you can have a mixture of unauthenticated, authenticated, encrypted, and nonencrypted channels. When you connect to your Kafka implementation in App Connect, you need to select an authorization method that reflects how your brokers are configured. You might need to speak to your Kafka administrator to get the values that are needed to connect in App Connect.

One of the following four authorization methods is needed to authenticate your connection. Each method has its own set of credentials for connecting to Kafka as displayed in Table 1. If you want to connect to a Kafka schema registry, you need to supply additional credentials as described in Connecting to a Kafka schema registry.

PLAINTEXT
The default setting for Kafka communication. Select this option for an unauthenticated and nonencrypted channel. If your brokers are configured to authenticate or encrypt communication, select one of the following options instead.
SASL_PLAINTEXT
Select this option for authentication by Simple Authentication and Security Layer (SASL) (a username and password on a nonencrypted channel).
SASL_SSL
Select this option for authentication by Simple Authentication and Security Layer (SASL) (a username and password on a Secure Sockets Layer (SSL) encrypted channel).
SSL
Select this option for authentication by an SSL encryption channel.
Table 1. Credentials for each authorization method

  Credential                       PLAINTEXT   SASL_PLAINTEXT   SASL_SSL   SSL
  Kafka brokers list               Required    Required         Required   Required
  Client ID                        Optional    Optional         Optional   Optional
  Username                         N/A         Required         Required   N/A
  Password                         N/A         Required         Required   N/A
  Security mechanism               N/A         Required         Required   N/A
  Server CA certificate            N/A         N/A              Optional   Optional
  Client key                       N/A         N/A              Optional   Optional
  Client key password              N/A         N/A              Optional   Optional
  Client certificate               N/A         N/A              Optional   Optional
  Schema registry type             Optional    Optional         Optional   Optional
  Schema registry REST API URL     Optional    Optional         Optional   Optional
  Schema registry username         Optional    Optional         Optional   Optional
  Schema registry password         Optional    Optional         Optional   Optional
  Schema registry CA certificate   Optional    Optional         Optional   Optional
  Private network connection       Optional    Optional         Optional   Optional
The following credentials are requested.
Tip: For more information about using Apache Kafka, see the Kafka Documentation page.
Kafka brokers list
Specify the list of Kafka brokers in the format ["x.x.x.x:9092","y.y.y.y:9092"]; for example, ["192.1.0.12:9092","192.1.0.68:9092"].
Client ID
Specify a default client ID for all the producer and consumer instances that are associated with this account.
Username
Specify the Kafka server username.
Password
Specify the Kafka server password.
Security mechanism
Specify the Simple Authentication and Security Layer (SASL) mechanism that the broker is configured to accept. The options are PLAIN, SCRAM-SHA-256, and SCRAM-SHA-512, where SCRAM is the Salted Challenge Response Authentication Mechanism.
Server CA certificate
Specify the Certificate Authority (CA) certificate in PEM format to use for server side authentication. Mutual TLS (mTLS) authentication requires a Server CA certificate and a Client certificate.
Note: The CA certificate must not be Base64 encoded.
Client key
The user-generated private key in PEM format to use for client-side authentication. Mutual TLS (mTLS) authentication requires a Server CA certificate and a Client certificate.
Client key password
The password for the user-generated client key. Required only if the Client key is protected by a password.
Client certificate
The user-generated client certificate in PEM format to use for client-side authentication.
Schema registry type
Specify the schema registry. If not specified, the schema registry type defaults to Confluent.
Schema registry REST API URL
Specify the REST API URL of the schema registry in the format http[s]://<hostname|ip address>:<port>; for example: https://192.168.0.1:9001.
Schema registry username
Specify the username of the schema registry.
Schema registry password
Specify the password of the schema registry.
Schema registry CA certificate
Specify the Certificate Authority (CA) certificate of the schema registry in PEM format.
Private network connection

Select the name of a private network connection that App Connect uses to connect to your private network. This list is populated with the names of private network connections that are created from the Private network connections page in the Designer instance. You see this field only if a switch server is configured for this Designer instance. For more information, see Connecting to a private network from App Connect Designer. (In App Connect Designer 12.0.10.0-r1 or earlier instances that include this field, the display name is shown as Agent name.)
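The credential fields that are described above correspond to standard Kafka client configuration properties. As an illustration only, the following sketch maps the four authorization methods onto the property names that the Python kafka-python client uses; the broker addresses, usernames, passwords, and file names are hypothetical placeholders, not values from any real cluster.

```python
# Sketch: map App Connect Kafka credential fields onto kafka-python
# configuration properties. All concrete values below are hypothetical
# placeholders.

def build_client_config(auth_method, brokers, username=None, password=None,
                        security_mechanism="SCRAM-SHA-512", ca_cert_file=None):
    """Return a kafka-python style configuration dict for one of the four
    authorization methods: PLAINTEXT, SASL_PLAINTEXT, SASL_SSL, or SSL."""
    config = {
        "bootstrap_servers": brokers,       # e.g. ["192.1.0.12:9092", "192.1.0.68:9092"]
        "security_protocol": auth_method,
    }
    if auth_method in ("SASL_PLAINTEXT", "SASL_SSL"):
        # The SASL methods require a username, password, and security mechanism
        config["sasl_mechanism"] = security_mechanism  # PLAIN, SCRAM-SHA-256, or SCRAM-SHA-512
        config["sasl_plain_username"] = username
        config["sasl_plain_password"] = password
    if auth_method in ("SASL_SSL", "SSL") and ca_cert_file:
        # Server CA certificate (PEM) for the encrypted channel
        config["ssl_cafile"] = ca_cert_file
    return config

# PLAINTEXT needs only the brokers list; SASL_SSL adds SASL credentials and TLS
plain = build_client_config("PLAINTEXT", ["192.1.0.12:9092"])
sasl = build_client_config("SASL_SSL", ["192.1.0.12:9092"],
                           username="app", password="secret",
                           ca_cert_file="ca.pem")
```

The same mapping applies whichever client library your Kafka administrator uses, although property names differ between clients.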

To connect to a Kafka endpoint from the App Connect Designer Connect > Applications and APIs page for the first time, expand Kafka, then click Connect.

Before you use the account that is created in App Connect in a flow, rename the account to something meaningful that helps you to identify it. To rename the account on the Applications and APIs page, select the account, open its options menu (⋮), then click Rename Account.

General considerations for using Kafka in App Connect

  • You can add a New message event node to an event-driven flow to trigger the flow when a new message is received on a Kafka topic. To configure App Connect to detect the event, select the topic, and then optionally specify a message offset, client ID, group ID, and output schema.
    Kafka "New message" event node
    Message offset

    Specify the message offset that the flow (consumer) uses when it starts for the first time (with a specific group ID).

    • Select Earliest to start reading published messages (including historical messages) from the beginning of the topic.
    • Select Latest (the default) to read only messages that are published after the flow starts.
    Note: If you stop the flow and restart it later (with the same group ID), the flow resumes reading messages from where it left off regardless of the Message offset setting.

    Client ID

    Specify a unique identifier that can be used to help trace activity in Kafka. This client ID overrides any client ID that is specified in the credentials for the selected Kafka account for this event node.

    Group ID

    Specify a unique ID for a consumer group.

    You might have more than one flow that listens for new messages on the same topic. When a new message is received on a topic, you can use the Group ID field to define how App Connect consumes that message.

    • If you leave the Group ID field empty, all the flows are triggered when a new message is received on the specified topic. (An ID value of flowName_message is automatically generated.)
    • If you assign the same group ID to all the flows, the throughput of messages from Kafka is shared such that only one flow is triggered when a new message is received on the specified topic. This behavior is useful when you don't want every flow to process the same message; for example, for scaling or failover, if one flow stops, another flow in the group continues to consume messages.
    • If you assign different group IDs in each flow, all the flows are triggered when a new message is received on the specified topic.
    Select output schema

    If you have configured a schema registry, click Select output schema to select a schema type (AVRO or JSON), subject, and schema version. For more information, see Connecting to a Kafka schema registry.
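The group ID behavior that is described above follows Kafka's consumer-group semantics: flows that share a group ID share the topic's throughput, while flows in different groups (or with auto-generated groups) each receive every message. The following sketch illustrates that rule; the flow names and group IDs are made up.

```python
# Sketch of the consumer-group rule described above: a new message on a
# topic triggers one flow per distinct group ID. Flow names are made up.

def triggered_flows(flows):
    """flows: list of (flow_name, group_id) pairs listening on one topic.
    Returns the flows triggered by a single new message: one flow per
    group. A flow with an empty group ID gets an auto-generated group of
    its own (flowName_message), so it is always triggered."""
    triggered = []
    seen_groups = set()
    for name, group_id in flows:
        group = group_id or f"{name}_message"   # auto-generated when empty
        if group not in seen_groups:
            seen_groups.add(group)
            triggered.append(name)
    return triggered

# Same group ID: only one flow consumes the message
print(triggered_flows([("flowA", "g1"), ("flowB", "g1")]))   # ['flowA']
# Different or empty group IDs: every flow is triggered
print(triggered_flows([("flowA", "g1"), ("flowB", "g2")]))   # ['flowA', 'flowB']
print(triggered_flows([("flowA", ""), ("flowB", "")]))       # ['flowA', 'flowB']
```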

  • (General consideration) You can see lists of the trigger events and actions that are available on the Applications and APIs page of the App Connect Designer.

    For some applications, the events and actions depend on the environment and whether the connector supports configurable events and dynamic discovery of actions. If the application supports configurable events, you see a Show more configurable events link under the events list. If the application supports dynamic discovery of actions, you see a Show more link under the actions list.

  • (General consideration) If you are using multiple accounts for an application, the set of fields that is displayed when you select an action for that application can vary for different accounts. In the flow editor, some applications always provide a curated set of static fields for an action. Other applications use dynamic discovery to retrieve the set of fields that are configured on the instance that you are connected to. For example, if you have two accounts for two instances of an application, the first account might use settings that are ready for immediate use. However, the second account might be configured with extra custom fields.

Events and actions

Kafka events

These events occur in this application and trigger a flow to start completing the actions in the flow.

Note: In containers, only local accounts can be used for these events.
Messages
New message

Kafka actions

Your flow completes these actions on this application.

Messages
Send message
Topics
Retrieve topics
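For reference, the Send message action corresponds to an ordinary Kafka produce call. Kafka message values are bytes, and a common convention is UTF-8 encoded JSON. The following sketch shows that serialization step; the topic name, broker address, and payload fields are hypothetical placeholders, and the commented-out produce call assumes a reachable broker.

```python
import json

# Sketch: prepare a message payload as a Kafka producer's value_serializer
# typically would. The payload fields below are hypothetical placeholders.

def encode_message(payload: dict) -> bytes:
    """Serialize a dict payload to compact UTF-8 JSON bytes."""
    return json.dumps(payload, separators=(",", ":")).encode("utf-8")

record = encode_message({"orderId": 42, "status": "shipped"})
print(record)   # b'{"orderId":42,"status":"shipped"}'

# With kafka-python and a reachable broker, the send would look like:
#   producer = KafkaProducer(bootstrap_servers=["192.1.0.12:9092"],
#                            value_serializer=lambda p: json.dumps(p).encode("utf-8"))
#   producer.send("orders", {"orderId": 42, "status": "shipped"})
```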