Streaming OMEGAMON data from Kafka to the Elastic Stack
You can use Z Common Data Provider to read and stream OMEGAMON® data from Kafka to the Elastic Stack.
About this task
You must create a policy to stream OMEGAMON data. In the policy, select the OMEGAMON data stream, specify the Kafka topic that contains your OMEGAMON data, and add Elasticsearch as the subscriber. Then, copy and update the Logstash configuration files that ingest OMEGAMON data into Elasticsearch, and set up and start the Data Streamer to stream the OMEGAMON data from Kafka to the Elastic Stack.
Procedure
- Create a policy to stream OMEGAMON data stream.
- Click the Create a new policy box.
- In the resulting Policy Profile Edit window, type or update the required policy name and, optionally, a policy description.
- Click the Add Data Stream icon.
- Select OMEGAMON data stream and click Select. OMEGAMON data stream is shown as a node in the graph.
- Click the Configure icon on the OMEGAMON data stream node to configure the Kafka Topic Name. The value of the File Path is updated automatically to be consistent with the value of the Kafka Topic Name that you enter.
- Click the Subscribe icon on the OMEGAMON data stream node. The Policy Profile Edit window opens, where you can select a previously defined subscriber, or define a new subscriber by completing the following steps. Note: Do not stream OMEGAMON data and other data to the same Logstash instance.
- In the Subscribe to a data stream window, click the Add Subscriber icon.
- In the resulting Add subscriber window, update the associated configuration values, and click OK to save the subscriber. You can update the following values in the Add subscriber window:
- Name
- The name of the subscriber.
- Description
- An optional description for the subscriber.
- Protocol
- The streaming protocol that the Data Streamer uses to send data to the subscriber. For example, you can choose Logstash if you want to use the Elastic Stack as a subscriber, or you can choose Splunk via Data Receiver if you want to use Splunk as a subscriber. Make sure you choose the protocol that meets your requirements. For more information about protocols, see Subscriber configuration.
- Host
- The host name or IP address of the subscriber.
- Port
- The port on which the subscriber listens for data from the Data Streamer.
- Auto-Qualify
- This value is ignored.
- In the Subscribe to a data stream window, select one or more subscribers, and click Update Subscriptions. The subscribers that you choose are then shown on the graph.
- To save the policy, click Save.
For more information about creating a policy to stream OMEGAMON data, see Creating a policy to stream OMEGAMON data stream.
- Copy the procedure HBODSPRO in the hlq.SHBOSAMP library to a user procedure library. You can rename this procedure according to your installation conventions.
- In your copy of the procedure HBODSPRO, customize the following parameter values and environment variables for your environment:
- mode=kafka
- If you specify this parameter, the Data Streamer will consume OMEGAMON data from the Apache Kafka server that is specified in the
KAFKA_SERVER environment variable or the customized Apache Kafka consumer
properties file, and send the data to the subscribers that are defined in the policy. For more
information, see KAFKA_SERVER and KAFKA_PROPER. Note: Even if you specify this parameter, the Data Streamer can still receive data from the data gatherers. In other words, by specifying mode=kafka, the Data Streamer can process data from both the data gatherers and Apache Kafka.
- KAFKA_SERVER
- The address list of Apache Kafka bootstrap servers. It is a comma-separated list of host:port pairs.
- KAFKA_PROPER
- The full path for the customized Apache Kafka consumer properties file. The default path is CDP_HOME/gatherer.consumer.properties. Important: You must specify the address list of Apache Kafka bootstrap servers either in the KAFKA_SERVER environment variable or in the customized Kafka consumer properties file. If you specify both, the value in the KAFKA_SERVER environment variable takes effect.
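As a sketch of what these two settings can look like, you can either set the KAFKA_SERVER environment variable or set the standard Kafka consumer property bootstrap.servers in the properties file. The broker host names and group ID below are placeholders for illustration, not values from this documentation:

```properties
# Option 1: KAFKA_SERVER environment variable
# (comma-separated list of host:port pairs)
KAFKA_SERVER=kafka1.example.com:9092,kafka2.example.com:9092

# Option 2: in the customized consumer properties file
# (default path: CDP_HOME/gatherer.consumer.properties),
# use the standard Kafka consumer bootstrap server property
bootstrap.servers=kafka1.example.com:9092,kafka2.example.com:9092
```

If both are set, remember that the KAFKA_SERVER environment variable takes effect.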
For more information about HBODSPRO, see Customizing the Data Streamer started task.
- If the value of Enable Multiple Kafka Topics for OMEGAMON data streams in the policy is set to Yes, copy the sample odp-topics-list.json file from the CDP_SMP_HOME/DS/LIB directory to the CDP_HOME directory. After you copy the odp-topics-list.json file, edit the file to select the OMEGAMON data streams that you want to collect. For more information, see Configuring the odp-topics-list.json file to collect OMEGAMON data from multiple Kafka topics.
- Configure Logstash.
- Copy the following configuration files from the Elasticsearch ingestion kit in the ZLDA-IngestionKit-raw-v.r.m.f.zip file to your Logstash configuration directory, and update the B_CDPz_Omegamon.conf and Q_CDPz_Omegamon.conf files according to your environment.
- B_CDPz_Omegamon.conf
This file contains the input stage. Specify the TCP/IP port on which Logstash listens for data from the Data Streamer. The default value is 8080.
- CDPz_Omegamon.conf
This file contains the information of how Logstash parses and splits the concatenated JSON data, and a unique field name annotation stage that maps to OMEGAMON data.
- Q_CDPz_Omegamon.conf
This file contains an output stage that sends all records to a single Elasticsearch server. Set the hosts parameter to the IP address where Elasticsearch is running. The default value is localhost.
Note: To integrate with the current OMEGAMON dashboard, do not modify the parameters in the CDPz_Omegamon.conf file unless the OMEGAMON dashboard needs to be updated.
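The input and output stages described above follow standard Logstash syntax. As a minimal sketch, assuming the default port 8080 and a local Elasticsearch on its default port 9200 (these are not the full shipped files, which come with the ingestion kit):

```
# B_CDPz_Omegamon.conf (sketch): listen for data from the Data Streamer
input {
  tcp {
    port => 8080
  }
}

# Q_CDPz_Omegamon.conf (sketch): send all records to one Elasticsearch server
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

In practice the stages live in separate files, as above; Logstash concatenates all files in the configuration directory into one pipeline.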
- Start the Elastic Stack.
- Start the Data Streamer by running the following command.
START HBODSPRO
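To picture what the parsing stage in CDPz_Omegamon.conf does conceptually, the following Python sketch splits a string of concatenated JSON records into individual objects. This is only an illustration of the idea; the field names in the sample payload are invented, and the actual filter logic runs inside Logstash and ships with the ingestion kit.

```python
import json

def split_concatenated_json(payload: str) -> list:
    """Split a string of back-to-back JSON objects into a list of dicts."""
    decoder = json.JSONDecoder()
    records = []
    idx = 0
    while idx < len(payload):
        # Skip any whitespace between adjacent objects.
        while idx < len(payload) and payload[idx].isspace():
            idx += 1
        if idx >= len(payload):
            break
        # raw_decode parses one object and reports where it ended.
        obj, end = decoder.raw_decode(payload, idx)
        records.append(obj)
        idx = end
    return records

# Hypothetical concatenated records, purely for illustration.
stream = '{"host":"SYS1","cpu":42}{"host":"SYS2","cpu":17}'
print(split_concatenated_json(stream))
```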