Streaming OMEGAMON data from Kafka to a different Kafka
You can use Z Common Data Provider to read and stream OMEGAMON® data from Kafka to a different Kafka.
About this task
You must create a policy to stream OMEGAMON data. In the policy, select OMEGAMON data stream, specify the Kafka topic of your OMEGAMON data, and add a different Kafka as the subscriber. Then, set up and start the Data Streamer to stream the OMEGAMON data from Kafka to a different Kafka.
Procedure
- Create a policy to stream OMEGAMON data.
- Click the Create a new policy box.
- In the resulting Policy Profile Edit window, type or update the required policy name and, optionally, a policy description.
- Click the Add Data Stream icon.
- Select OMEGAMON data stream and click Select.
OMEGAMON data stream is shown as a node in the graph.
- Click the Configure icon on the OMEGAMON data stream node to configure the Kafka Topic Name. The value of the File Path is automatically set to be consistent with the value of the Kafka Topic Name that you enter.
- Click the Subscribe icon on the OMEGAMON data stream node. The Policy Profile Edit window opens, where you can select a previously defined subscriber, or define a new subscriber by completing the following steps.
- In the Subscribe to a data stream window, click the Add Subscriber icon.
- In the resulting Add subscriber window, update the associated configuration values, and click OK to save the subscriber. You can update the following values in the Add subscriber window:
  - Name: The name of the subscriber.
  - Description: An optional description for the subscriber.
  - Protocol: The streaming protocol that the Data Streamer uses to send data to the subscriber. For example, you can choose Kafka if you want to use Kafka as a subscriber. For more information about protocols, see Subscriber configuration.
  - Bootstrap Servers: This configuration value is valid only when you choose the Kafka protocol. It specifies the addresses of the Kafka bootstrap servers as a comma-separated list of host and port pairs, with a colon (:) separating each host from its port (for example, host1:9092,host2:9092).
  - Enable Customized Kafka Topic: This configuration value is valid only when you choose the Kafka protocol. It specifies whether a customized Kafka topic name is enabled. The default value is No. If the value is No, the Kafka topic is created based on the default Kafka topic, which you can get from the related data stream property Default Kafka Topic.
  - Prefix: This configuration value is valid only when you choose the Kafka protocol and is available only when Enable Customized Kafka Topic is set to Yes. It specifies the prefix of the topic name. The default value is IBM-CDP. For more information, see Configuring Apache Kafka.
  - Customized Topic: This optional configuration value is valid only when you choose the Kafka protocol and is available only when Enable Customized Kafka Topic is set to Yes. It specifies the customized topic name. For more information, see Configuring Apache Kafka.
  - Kafka Producer Config: This optional configuration value is valid only when you choose the Kafka protocol. It specifies the file path for the Kafka producer properties file. A sketch of such a file follows this list.
  - Kafka Consumer Config: This optional configuration value is valid only when you choose the Kafka protocol. It specifies the file path for the Kafka consumer properties file.
  - Format: This configuration value is valid only when you choose the Kafka protocol. It specifies the format of the data that is sent to the Kafka server. The format can be CSV or Key-Value. The default value is CSV. For more information, see Configuring Apache Kafka.
  - Partitioner Class: This configuration value is valid only when you choose the Kafka protocol. It specifies the class that is used to determine which partition records are sent to when the records are produced. You can choose any of the following values. The default value is Default.
    - Default: Indicates that the class setting is defined in the Kafka producer properties file.
    - UniformStickyPartitioner: Indicates that records stick to one partition until the batch is full or linger.ms elapses.
    - Auto-Qualify: This value is ignored.
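For reference, a minimal Kafka producer properties file might look like the following sketch. The host names and ports are placeholders, and the partitioner.class line only illustrates where a partitioner is defined when Partitioner Class is set to Default; both assume standard Apache Kafka producer settings and must be replaced with values for your environment.

    # Sketch of a Kafka producer properties file (placeholder values).
    # Replace the host:port pairs with your own Kafka bootstrap servers.
    bootstrap.servers=kafka1.example.com:9092,kafka2.example.com:9092
    # When Partitioner Class is Default, the partitioner can be defined here.
    partitioner.class=org.apache.kafka.clients.producer.UniformStickyPartitioner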
- In the Subscribe to a data stream window, select one or more subscribers, and click Update Subscriptions. The subscribers that you choose are then shown on the graph.
- To save the policy, click Save.
For more information about creating a policy to stream OMEGAMON data, see Creating a policy to stream OMEGAMON data stream.
- Copy the procedure HBODSPRO in the hlq.SHBOSAMP library to a user procedure library. You can rename this procedure according to your installation conventions.
- In your copy of the procedure HBODSPRO, customize the following parameter values and environment variables for your environment:
  - mode=kafka: If you specify this parameter, the Data Streamer consumes OMEGAMON data from the Apache Kafka server that is specified in the KAFKA_SERVER environment variable or in the customized Apache Kafka consumer properties file, and sends the data to the subscribers that are defined in the policy. For more information, see KAFKA_SERVER and KAFKA_PROPER. Note: Even if you specify this parameter, the Data Streamer can still receive data from the data gatherers. In other words, by specifying mode=kafka, the Data Streamer can process data from both the data gatherers and Apache Kafka.
  - KAFKA_SERVER: The address list of the Apache Kafka bootstrap servers. It is a comma-separated list of host:port pairs.
  - KAFKA_PROPER: The full path for the customized Apache Kafka consumer properties file. The default path is CDP_HOME/gatherer.consumer.properties. Important: You must specify the address list of Apache Kafka bootstrap servers either in the KAFKA_SERVER environment variable or in the customized Kafka consumer properties file. If you specify both, the value in the KAFKA_SERVER environment variable takes effect. A sketch of such a consumer properties file follows this list.
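A minimal customized Apache Kafka consumer properties file might look like the following sketch. The host names and ports are placeholders that assume standard Apache Kafka consumer settings; replace them with the bootstrap servers of the Kafka that holds your OMEGAMON data.

    # Sketch of a customized Kafka consumer properties file, for example
    # CDP_HOME/gatherer.consumer.properties (placeholder values).
    # If KAFKA_SERVER is also set, the KAFKA_SERVER value takes effect.
    bootstrap.servers=kafka1.example.com:9092,kafka2.example.com:9092
    # Other standard Apache Kafka consumer settings, such as security
    # settings, can be added here if your environment requires them.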
For more information about HBODSPRO, see Configuring the Data Streamer.
- Start the Data Streamer by running the following command:
START HBODSPRO
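If you renamed your copy of the procedure, substitute that name for HBODSPRO in the command.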