Configuring the Data Collector to collect SMF data in batch mode

This topic provides step-by-step instructions on how to configure the Data Collector to collect System Management Facilities (SMF) data in batch mode.

About this task

You need to customize the following files to configure the Data Collector:
  • Configuration file:
    • <policy>.collection-config.json, which is generated through the Configuration Tool
    • (optional) application.properties
  • Batch JCL: HBODCBAT

Procedure

  1. Generate the <policy>.collection-config.json file through the Configuration Tool. For more detailed instructions, see Creating a policy for streaming data to Apache Kafka through Data Collector.
    1. From the Z Common Data Provider Configuration Tool, click the Create a policy box under the Policy for streaming data to Apache Kafka through Data Collector section.
    2. In the resulting Policy Profile Edit window, type or update the required policy name and, optionally, a policy description.
    3. Click the Configure data resources box to open the Configure Data Resources window.
    4. Click the ADD SYSTEM button to open the Add system dialog. Specify the System name and Bootstrap servers, and then click OK.
      System name
      The name of the system in the sysplex.
      Bootstrap servers
      A comma-delimited list of host:port pairs to use for establishing the initial connections to the Kafka cluster.
    5. In the system section, click the Add data resource icon to add a data resource.
      1. Select SMF from the Data source drop-down list.
      2. Specify other parameters as needed.
      3. Click OK to save your settings.
    6. Click the edit icon at the end of the resource to edit the topic name for SMF data.
      In the resulting Configure topic name and topic name group window, define the topic name for SMF data. The default value is IBM-CDP-zOS-SMF-{plexName}-{sourceType}, where plexName resolves to the sysplex name and sourceType resolves to the SMF record type number. For example, for sysplex PLEX1 and SMF record type 30, the default topic name resolves to IBM-CDP-zOS-SMF-PLEX1-30. The topic name applies to all SMF data resources in the same system.
    7. When you finish the configuration, click the FINISH button to return to the Policy Profile Edit window.
    8. To save the policy, click Save.
    Note: You can also use the following command to copy the sample $INSTDIR/DC/samples/batch directory, which contains a sample collection-config.json file, to a directory of your choice. $INSTDIR represents the directory where the Data Collector is installed.
    cp -R $INSTDIR/DC/samples/batch <destination directory>
    Sample configuration:
    {
        "lpars": [
            {
                "name": "lpar-name",
                "bootstrapServers": "localhost:9092",
                "smf": {
                    "topicName": "IBM-CDP-zOS-SMF-{dataSourceType}"
                }
            }
        ]
    }
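    For reference, the following shell sketch shows one way to set up the working directory. The path /u/kafka/test/samples is a hypothetical value (it matches the CDPWORK value in the HBODCBAT sample later in this topic), and the source path of the generated policy file is also hypothetical. The generated <policy>.collection-config.json file, and the optional application.properties file from the next step, must both reside in this directory.
      # Minimal sketch; /u/kafka/test/samples and the source path of the
      # generated policy file are hypothetical values.
      cp -R $INSTDIR/DC/samples/batch /u/kafka/test/samples
      # Place the policy file that the Configuration Tool generated in the
      # same directory, for example:
      cp /tmp/policy.collection-config.json /u/kafka/test/samples/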
  2. Optional: Copy and customize the application.properties file if you have the following requirements:
    1. To enable Transport Layer Security (TLS) communications with the Apache Kafka server, update the TLS related parameters that are described in Updating the configuration file of the Data Collector.
    2. Specify other parameters as needed. For detailed instructions, see The application.properties file.
    The sample application.properties file for batch mode is located in $INSTDIR/DC/samples/batch, where $INSTDIR is the directory where the Data Collector is installed. If you need to configure the application.properties file, copy it to the same directory as the <policy>.collection-config.json file.
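    The authoritative TLS parameter names are documented in Updating the configuration file of the Data Collector. Purely as a hypothetical illustration of the kind of entries involved, a TLS-enabled application.properties file might contain standard Kafka client settings similar to the following; verify every parameter name against that topic before you use it.
      # Hypothetical illustration only; confirm the parameter names in
      # "Updating the configuration file of the Data Collector".
      security.protocol=SSL
      ssl.truststore.location=/u/kafka/ssl/kafka.truststore.jks
      ssl.truststore.password=changeit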
  3. Create and customize the job HBODCBAT.
    1. From data set hlq.SHBOSAMP, copy job HBODCBAT to your JCL library.
    2. In the copied HBODCBAT, set the following environment variables for your environment:
      JAVAHOME
      Specify the Java™ installation directory.
      CDPINST
      Use this parameter to specify the location where the Data Collector is installed. Replace /usr/lpp/IBM/zcdp/v5r1m0 with your installation directory.
      CDPWORK
      Use this parameter to specify the location of the policy file <policy>.collection-config.json, which is generated through the Configuration Tool. If you also configured the application.properties file, place it in the same directory as the <policy>.collection-config.json file.
      POLICY
      Use this parameter to specify the prefix of the <policy>.collection-config.json file. Replace <policy> with the prefix you would like to use.
      RESTYPE
      Use this parameter to specify the data type to be collected. The allowed values are SYSLOG and SMF. In this case, set the value to SMF.
      Sample configuration:
      //  SET JAVAHOME='/usr/lpp/java/J8.0_64'
      //  SET CDPINST='/usr/lpp/IBM/zcdp/v5r1m0'
      //  SET CDPWORK='/u/kafka/test/samples'
      //  SET POLICY='policy'
      //  SET RESTYPE='SMF'
      The parameters must be set correctly; otherwise, the Data Collector stops abnormally.
    3. Comment out the <DD Name> DD statement and the HBOIN DD statement, which are used for syslog data collection:
      //* HBOIN    DD  *
      //* PLEXNAME:<Plex Name>, DDNAME:<DD Name>
      //* <DD Name> DD   DISP=SHR,DSN=<Syslog DataSet Name>
    4. Customize the SMFIN DD statement by replacing <SMF DataSet Name> with the name of the data set from which you want to collect SMF data in the batch processor.
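      As an illustrative sketch only (keep the placeholder until you substitute your own data set name), a customized SMFIN DD statement follows the same pattern as the syslog DD statement that is shown above:
      //* Illustrative sketch; replace the placeholder with your SMF data set name.
      //SMFIN    DD   DISP=SHR,DSN=<SMF DataSet Name>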