Configuring the Data Collector to collect SMF data in batch mode
This topic provides step-by-step instructions for configuring the Data Collector to collect System Management Facilities (SMF) data in batch mode.
About this task
- Configuration file:
- <policy>.collection-config.json, which is generated through the Configuration Tool
- (optional) application.properties
- Batch JCL: HBODCBAT
Procedure
- Generate the <policy>.collection-config.json file through the Configuration Tool. For more detailed instructions, see Creating a policy for streaming data to Apache Kafka through Data Collector.
- From the Z Common Data Provider Configuration Tool, click the Create a policy box under the Policy for streaming data to Apache Kafka through Data Collector section.
- In the resulting Policy Profile Edit window, type or update the required policy name and, optionally, a policy description.
- Click the Configure data resources box to open the Configure Data Resources window.
- Click the ADD SYSTEM button to open the Add system dialog. Specify System name and Bootstrap servers, and then click OK.
- System name
- The name of the system in the sysplex.
- Bootstrap servers
- A comma-delimited list of host:port pairs to use for establishing the initial connections to the Kafka cluster.
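For example, a two-broker cluster might be specified as follows (the host names here are placeholders; use the addresses of your own Kafka brokers):
kafka1.example.com:9092,kafka2.example.com:9092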
- In the system section, click the Add data resource icon to add a data resource.
- Select SMF from the Data source drop-down list.
- Specify other parameters as needed.
- Click OK to save your settings.
- Click the edit icon at the end of the resource to edit the topic name for SMF data. In the resulting Configure topic name and topic name group window, define the topic name for SMF data. The default value is IBM-CDP-zOS-SMF-{plexName}-{sourceType}, where plexName resolves to the sysplex name and sourceType resolves to the SMF integer type. The topic name applies to all SMF data resources in the same system.
- When you finish the configuration, click the FINISH button to return to the Policy Profile Edit window.
- To save the policy, click Save.
Note: You can also use the following command to create a copy of the sample $INSTDIR/DC/samples/batch/collection-config.json file and put it in a directory for use, where $INSTDIR represents the directory where the Data Collector is installed:
cp -R $INSTDIR/DC/samples/batch <destination directory>
Sample configuration:
{
  "lpars": [
    {
      "name": "lpar-name",
      "bootstrapServers": "localhost:9092",
      "smf": {
        "topicName": "IBM-CDP-zOS-SMF-{dataSourceType}"
      }
    }
  ]
}
- Optional: Copy and customize the application.properties file if you have the following requirements:
- To enable Transport Layer Security (TLS) communications with the Apache Kafka server, update the TLS related parameters that are described in Updating the configuration file of the Data Collector.
- Specify other parameters as needed. For detailed instructions, see The application.properties file.
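As a rough sketch of what the TLS-related entries in application.properties might look like (the property names below follow standard Apache Kafka client configuration and the paths are hypothetical; see Updating the configuration file of the Data Collector for the exact parameter names that the Data Collector supports):
security.protocol=SSL
ssl.truststore.location=/u/kafka/certs/truststore.p12
ssl.truststore.password=<truststore password>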
The sample application.properties file for batch mode is located in $INSTDIR/DC/samples/batch, where $INSTDIR is the directory where the Data Collector is installed. If you need to configure the application.properties file, copy it and place it in the same directory as the <policy>.collection-config.json file.
- Create and customize the job HBODCBAT.
- From data set hlq.SHBOSAMP, copy job HBODCBAT to your JCL library.
- In the copied HBODCBAT, set the following environment variables for your environment:
- JAVAHOME
- Specify the Java™ installation directory.
- CDPINST
- Use this parameter to specify the location where the Data Collector is installed. Replace /usr/lpp/IBM/zcdp/v5r1m0 with the directory where the Data Collector is installed.
- CDPWORK
- Use this parameter to specify the location of the policy file <policy>.collection-config.json, which is generated through the Configuration Tool. If you also configured the application.properties file, place it in the same directory as the <policy>.collection-config.json file.
- POLICY
- Use this parameter to specify the prefix of the <policy>.collection-config.json file. Replace <policy> with the prefix you would like to use.
- RESTYPE
- Use this parameter to specify which data type to collect. The allowed values are SYSLOG and SMF. In this case, set the value to SMF.
The parameters must be set correctly; otherwise, the Data Collector stops abnormally.
- Sample configuration:
// SET JAVAHOME='/usr/lpp/java/J8.0_64'
// SET CDPINST='/usr/lpp/IBM/zcdp/v5r1m0'
// SET CDPWORK='/u/kafka/test/samples'
// SET POLICY='policy'
// SET RESTYPE='SMF'
- Comment out the <DD Name> DD statement and the HBOIN statement.
//* HBOIN DD *
//* PLEXNAME:<Plex Name>, DDNAME:<DD Name>
//* <DD Name> DD DISP=SHR,DSN=<Syslog Data Set Name>
- Customize the SMFIN DD statement by replacing <SMF Data Set Name> with the name of the data set from which you want to collect SMF data in the batch processor.
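For example, assuming your SMF records were dumped to a hypothetical data set named USER1.SMF.DUMP, the customized statement might look like:
//SMFIN    DD DISP=SHR,DSN=USER1.SMF.DUMP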
- From data set