Creating a policy file to send data to Logstash

You must create a policy file that sends the desired data source types to Logstash.

About this task

This section does not apply if you use Apache Kafka as the protocol to send data to the Z Data Analytics Platform or the Elastic Stack platform.

The data source types are contained in the following groups:
IBM Z® Common Data Provider
    The data source types in this group equate to raw data and must be sent to the Logstash pipeline zlda-config-raw.
IBM Z Operational Log and Data Analytics
    The data source types in this group equate to curated data and must be sent to the Logstash pipeline zlda-config-curated.
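Both pipelines must be defined in the pipelines.yml file of the Logstash installation that receives the data. The following entries are a minimal sketch: the pipeline.id values match the pipeline names above, but the path.config directories are assumptions and must point to wherever the zlda-config-raw and zlda-config-curated configuration files are located on your Logstash host.

    # pipelines.yml (sketch): one entry per pipeline.
    # The path.config values are placeholders; adjust them to the
    # directories that contain your configuration files.
    - pipeline.id: zlda-config-raw
      path.config: "/etc/logstash/zlda-config-raw/*.conf"
    - pipeline.id: zlda-config-curated
      path.config: "/etc/logstash/zlda-config-curated/*.conf"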

Procedure

  1. To send data from the IBM Z Common Data Provider group, create a Logstash subscriber, and configure the subscriber with the host name and port that are defined in the Logstash configuration file B_CDPz_Input.conf in the Logstash pipeline zlda-config-raw. Connect the data from the IBM Z Common Data Provider group to this subscriber. (The sketch after these steps shows where the host name and port typically appear in the configuration files.)
  2. To send data from the IBM Z Operational Log and Data Analytics group, create a Logstash subscriber, and configure the subscriber with the host name and port that are defined in the Logstash configuration file B_cdpz.conf in the Logstash pipeline zlda-config-curated. Connect the data from the IBM Z Operational Log and Data Analytics group to this subscriber.
    Important: If you want to send any log data (for example, z/OS® SYSLOG, syslogd, CICS® MSGUSR, CICS EYULOG, NetView®, or zSecure Access Monitor logs) and you use the dashboards and saved searches in the IBM Z Operational Log and Data Analytics application, you must send that log data to the Logstash subscriber that is configured with the host name and port that are defined in the Logstash configuration file B_cdpz.conf in the Logstash pipeline zlda-config-curated.
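For reference, the host name and port that you configure in each subscriber come from the input stage of the corresponding configuration file. The following fragments are a sketch of what such input stanzas can look like, assuming a Logstash TCP input; the host and port values shown here are placeholders, so always take the actual values from the B_CDPz_Input.conf and B_cdpz.conf files that are shipped with your installation.

    # Input stanza in B_CDPz_Input.conf (pipeline zlda-config-raw).
    # Sketch only; the real host and port are defined in the shipped file.
    input {
      tcp {
        host => "0.0.0.0"   # host name to set in the raw-data subscriber
        port => 8080        # placeholder port; use the value from the file
      }
    }

    # Input stanza in B_cdpz.conf (pipeline zlda-config-curated). Sketch only.
    input {
      tcp {
        host => "0.0.0.0"   # host name to set in the curated-data subscriber
        port => 8081        # placeholder port; use the value from the file
      }
    }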