Streaming OMEGAMON Data from Kafka to Splunk via the HEC

You can use Z Common Data Provider to read and stream OMEGAMON® data from Kafka to Splunk via the HTTP Event Collector (HEC).

About this task

You must create a policy to stream OMEGAMON data. In the policy, select the OMEGAMON data stream, specify the Kafka topic of your OMEGAMON data, and add Splunk via HEC as the subscriber. Then, set up and start the Data Streamer to stream OMEGAMON data from Kafka to the Splunk HEC.

Procedure

  1. Create a policy to stream OMEGAMON data.
    1. Click the Create a new policy box.
    2. In the resulting Policy Profile Edit window, type or update the required policy name and, optionally, a policy description.
    3. Click the Add Data Stream icon.
    4. Select OMEGAMON data stream and click Select.
      OMEGAMON data stream is shown as a node in the graph.
    5. Click the Configure icon on the OMEGAMON data stream node to configure the Kafka Topic Name.
      The File Path value is updated automatically to match the Kafka Topic Name that you enter.
    6. Click the Subscribe icon on the OMEGAMON data stream node. In the Policy Profile Edit window that opens, select a previously defined subscriber, or define a new subscriber by completing the following steps.
      1. In the Subscribe to a data stream window, click the Add Subscriber icon.
      2. In the resulting Add subscriber window, update the associated configuration values, and click OK to save the subscriber.
        You can update the following values in the Add subscriber window:
        Name
        The name of the subscriber.
        Description
        An optional description for the subscriber.
        Protocol
        The streaming protocol that the Data Streamer uses to send data to the subscriber. For example, you can choose Logstash if you want to use the Elastic Stack as a subscriber, or you can choose Splunk via Data Receiver if you want to use Splunk as a subscriber. Make sure you choose the protocol that meets your requirements. For more information about protocols, see Subscriber configuration.
        Host
        The host name or IP address of the subscriber.
        Port
        The port on which the subscriber listens for data from the Data Streamer.
        Auto-Qualify
        This value is ignored.
        Compression Type
        Specifies whether data is compressed before it is sent to a Humio subscriber. You can choose one of the following values. The default value is None.
        None
        The data is not compressed before it is sent to a Humio subscriber.
        GZIP
        The data is GZIP compressed before it is sent to a Humio subscriber.
        Number of threads
        This configuration value is valid only when you choose one of the HEC protocols or CDP Humio protocols. It specifies the number of threads that send data to the subscriber. The default value is 12, and the value must be in the range 1 - 20. For Splunk HEC protocols, you generally do not need to change this value. For CDP Humio protocols, adjust the value based on your environment resources. For more information about the environment resources that Humio requires, see Preparation for Installing Humio.
        Token
        This configuration value is valid only when you choose one of the HEC protocols or one of the CDP Humio protocols. It specifies the token value. For more information about how to create a Splunk HEC token, see Sending data directly to Splunk by using Splunk HEC as the subscriber. For more information about how to create a Humio repository token, see Preparing to send data to Humio via Logstash.
      3. In the Subscribe to a data stream window, select one or more subscribers, and click Update Subscriptions.

        The subscribers that you choose are then shown on the graph.

      4. To save the policy, click Save.
    For more information about creating a policy to stream OMEGAMON data, see Creating a policy to stream OMEGAMON data stream.
  2. Copy the procedure HBODSPRO in the hlq.SHBOSAMP library to a user procedure library.
    You can rename this procedure according to your installation conventions.
  3. In your copy of the procedure HBODSPRO, customize the following parameter values and environment variables for your environment:
    mode=kafka
    If you specify this parameter, the Data Streamer will consume OMEGAMON data from the Apache Kafka server that is specified in the KAFKA_SERVER environment variable or the customized Apache Kafka consumer properties file, and send the data to the subscribers that are defined in the policy. For more information, see KAFKA_SERVER and KAFKA_PROPER.
    Note: Even if you specify this parameter, the Data Streamer can still receive the data from the data gatherers. In other words, by specifying mode=kafka, the Data Streamer can process data from both the data gatherers and Apache Kafka.
    KAFKA_SERVER
    The address list of Apache Kafka bootstrap servers. It is a comma-separated list of host:port pairs.
    KAFKA_PROPER
    The full path for the customized Apache Kafka consumer properties file. The default path is CDP_HOME/gatherer.consumer.properties.
    Important: You must specify the address list of Apache Kafka bootstrap servers either in the KAFKA_SERVER environment variable or in the customized Kafka consumer properties file. If you specify both, the value in the KAFKA_SERVER environment variable takes precedence.
    For more information about the customization of the procedure HBODSPRO, see Configuring the Data Streamer.
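    For illustration, a minimal customized Kafka consumer properties file might look like the following sketch. The server addresses and group ID are placeholder values; bootstrap.servers, group.id, and auto.offset.reset are standard Apache Kafka consumer properties. Confirm which properties your Z Common Data Provider level supports before you edit the file.

```properties
# Comma-separated list of Kafka bootstrap servers (host:port pairs).
# This value is ignored if the KAFKA_SERVER environment variable is also set.
bootstrap.servers=kafka1.example.com:9092,kafka2.example.com:9092

# Consumer group ID for the Data Streamer (placeholder value).
group.id=cdp-data-streamer

# Start from the earliest available offset when no committed offset exists.
auto.offset.reset=earliest
```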
  4. Start the Data Streamer by running the following command.
    START HBODSPRO
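    Before or after starting the Data Streamer, you may want to verify that the Splunk HEC endpoint and token are reachable independently of the policy. The following sketch builds a test event for the standard Splunk HEC endpoint (/services/collector/event) with the standard Splunk <token> authorization header; the host name, port (8088 is the Splunk HEC default), and token value are placeholders that you must replace with your own.

```python
import json
import urllib.request

def build_hec_request(host, port, token, event, use_https=True):
    """Build an HTTP request that posts one test event to Splunk HEC."""
    scheme = "https" if use_https else "http"
    url = f"{scheme}://{host}:{port}/services/collector/event"
    payload = json.dumps({"event": event, "sourcetype": "manual"}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Authorization": f"Splunk {token}"},
        method="POST",
    )

# Placeholder values -- substitute your Splunk host and HEC token.
req = build_hec_request(
    "splunk.example.com",
    8088,
    "00000000-0000-0000-0000-000000000000",
    "CDP connectivity test",
)

# Sending the request requires a reachable Splunk instance:
# urllib.request.urlopen(req)
```

    A successful post returns an HTTP 200 response with a JSON body such as {"text":"Success","code":0}; an invalid token returns HTTP 403.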

Results

OMEGAMON data is streamed from Kafka and received in Splunk via the HEC.