Deploying the Z Operational Log and Data Analytics application on a single Splunk Enterprise system

The advantage of deploying the IBM Z® Operational Log and Data Analytics application on a single Splunk Enterprise system is that the deployment is simple and quick.

About this task

The steps in this procedure must be done on the system where the web browser is running rather than on the Splunk Enterprise server.

Procedure

To deploy the IBM Z Operational Log and Data Analytics application, complete the following steps:

  1. If you are using the Z Common Data Provider Data Receiver as a subscriber, install and configure the Data Receiver.
    Important: The Data Receiver working directory and output directory must also be available to Splunk. If you set these directories as environment variables, verify that the working directory is assigned to CDPDR_HOME and that the output directory is assigned to CDPDR_PATH, as described in Setting up a working directory and an output directory for the Data Receiver. If you prefer not to change your system environment variables, you can instead specify CDPDR_HOME and CDPDR_PATH in $SPLUNK_HOME/etc/splunk-launch.conf, as shown in the configuration sketch that follows this procedure.
  2. Start the Data Receiver, as described in Running the Data Receiver.
  3. Define a policy with the Data Receiver as the subscriber.
  4. Mount the IBM Z Operational Log and Data Analytics ISO installation image, or extract the IBM Z Operational Log and Data Analytics .tar file.
    For more information about how to get the package, see Obtaining and preparing the installation files.
  5. Log in to Splunk.
  6. From the Splunk Web Home page, click the gear icon that is next to the word Apps.
  7. Select Install app from file.
  8. Navigate to the mounted ISO image or the directory where you extracted the .tar file, select the ibm_zlda_insights.spl file, and click Upload.
  9. If you are prompted to restart the Splunk Enterprise server, restart it.
  10. Verify that the application is shown in the list of apps and add-ons.
    The application is also in the following directory on the Splunk Enterprise server:
    $SPLUNK_HOME/etc/apps/ibm_zlda_insights
  11. Optional: Install the Splunk IT Service Intelligence (ITSI) content pack for IBM Z Operational Log and Data Analytics.
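
If you choose to set CDPDR_HOME and CDPDR_PATH in splunk-launch.conf (see step 1), the entries are plain name=value lines. The following configuration sketch is only an illustration; the directory paths are hypothetical, so substitute the working directory and output directory that you set up for your Data Receiver:
# Hypothetical entries in $SPLUNK_HOME/etc/splunk-launch.conf
# (example paths only; use your own Data Receiver directories)
CDPDR_HOME=/opt/cdpdr/work
CDPDR_PATH=/opt/cdpdr/output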

Results

You can see the data that is loaded into Splunk by using a simple search. For example, the following search shows you all ingested z/OS® SYSLOG events in the zos-syslog-console index:
index=zos-syslog-console sourcetype=zOS-SYSLOG-Console

If you expand an event, you can see the individual fields for which extraction rules are set.
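
For instance, the following search is a simple sketch that uses the table command to display a few of the extracted fields; the sysplex and jobname field names are taken from the search example later in this section:
index=zos-syslog-console sourcetype=zOS-SYSLOG-Console | table _time sysplex jobname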

The following search example shows the z/OS SYSLOG messages in the zos-syslog-console index that are issued by the CICS35 job running on your production sysplex (PRODPLEX in this example):
index=zos-syslog-console sysplex=PRODPLEX jobname=CICS35 sourcetype=zOS-SYSLOG-Console
You can also use Splunk analytics tools to analyze the data, or write your own deep analysis tools.
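For example, the following search is a minimal sketch of such an analysis; it counts SYSLOG messages by job name, using the same index, sourcetype, and jobname field as the earlier searches:
index=zos-syslog-console sourcetype=zOS-SYSLOG-Console | stats count by jobname | sort -count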
Tip: Currently the Splunk application supports only the following log data types for indexing:
  • z/OS SYSLOG
  • SMF
  • IMS
  • RMF III
  • CICS® EYULOG
  • CICS MSGUSR
  • WebSphere® SYSOUT
  • WebSphere SYSPRINT
  • USS Syslogd
  • NetView® Netlog
  • zSecure
Searches for other types of data do not yield any results, although the data is written to the output directory that is specified by the CDPDR_PATH environment variable. To use this data in Splunk, you can edit the Splunk application, which is installed in the $SPLUNK_HOME/etc/apps/ibm_zlda_insights/ directory. More dashboards and indexing capabilities are available in IBM Z Operational Log and Data Analytics.
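
As one illustration of what such an edit can involve, a new index in Splunk is typically defined through an indexes.conf stanza such as the following. This is only a sketch that assumes you add a custom index for the extra data; the index name zos-myfeed and its paths are hypothetical, and the exact files to edit depend on the structure of the installed application:
# Hypothetical indexes.conf stanza; the index name and paths are examples only
[zos-myfeed]
homePath   = $SPLUNK_DB/zos-myfeed/db
coldPath   = $SPLUNK_DB/zos-myfeed/colddb
thawedPath = $SPLUNK_DB/zos-myfeed/thaweddb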

Splunk indexers can generally ingest up to 300 GB of data per day. Larger data volumes require multiple indexers and search heads. For more information, see the Splunk recommendations on scaling and capacity planning.

What to do next

If you are using the Splunk HTTP Event Collector (HEC) as the subscriber (as indicated in Subscriber configuration), also complete the steps in Sending data directly to Splunk by using Splunk HEC as the subscriber.