Configuring the Data Collector to collect SYSLOG data in batch mode
This topic provides step-by-step instructions for configuring the Data Collector to collect z/OS® SYSLOG data in batch mode.
About this task
- Configuration files:
- <policy>.collection-config.json, which is generated through the Configuration Tool
- (optional) application.properties
- Batch JCL:
HBODCBAT
Procedure
- Generate the <policy>.collection-config.json file through the Configuration Tool. For more detailed instructions, see Creating a policy for streaming data to Apache Kafka through Data Collector.
- From the Z Common Data Provider Configuration Tool, click the Create a policy box under section Policy for streaming data to Apache Kafka through Data Collector.
- In the resulting Policy Profile Edit window, type or update the required policy name and, optionally, a policy description.
- Click the Configure data resources box to open the Configure Data Resources window.
- Click the ADD SYSTEM button to open the Add system dialog. Specify System name and Bootstrap servers, and then click OK.
- System name
- The name of the system in the sysplex.
- Bootstrap servers
- A comma-delimited list of host:port pairs to use for establishing the initial connections to the Kafka cluster.
- In the system section, click the Add data resource icon to add a data resource.
- Select LOG from the drop-down list of Data source.
- Specify a value for History topic name, which is the topic name for historical SYSLOG data when the Data Collector runs in batch mode. The default value is IBM-CDP-zOS-SYSLOG-Console-Historical.
- Specify other parameters as needed.
- Click OK to save your settings.
- When you finish the configuration, click the FINISH button to return to the Policy Profile Edit window.
- To save the policy, click Save.
Note: You can also use the following command to create a copy of the sample $INSTDIR/DC/samples/batch/collection-config.json file and put it in a directory for use. $INSTDIR represents the directory where the Data Collector is installed.

    cp -R $INSTDIR/DC/samples/batch <destination directory>

Sample configuration:

    {
      "lpars": [
        {
          "name": "lpar-name",
          "bootstrapServers": "localhost:9092",
          "log": {
            "historyTopicName": "IBM-CDP-zOS-SYSLOG-Console-Historical"
          }
        }
      ]
    }
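Before submitting the batch job, it can be convenient to confirm that a customized <policy>.collection-config.json is well-formed JSON and contains the fields shown in the sample above. The following script is not part of the product; it is a minimal sketch that checks only the field names that appear in the sample configuration.

```python
import json

def check_policy(text: str) -> list:
    """Return a list of problems found in the policy JSON (empty if none)."""
    problems = []
    cfg = json.loads(text)  # raises ValueError on malformed JSON
    lpars = cfg.get("lpars")
    if not isinstance(lpars, list) or not lpars:
        return ["'lpars' must be a non-empty array"]
    for i, lpar in enumerate(lpars):
        # Each system entry needs a name and the Kafka bootstrap servers.
        for key in ("name", "bootstrapServers"):
            if not lpar.get(key):
                problems.append(f"lpars[{i}]: missing '{key}'")
        # Batch SYSLOG collection needs the history topic name.
        if not lpar.get("log", {}).get("historyTopicName"):
            problems.append(f"lpars[{i}]: missing 'log.historyTopicName'")
    return problems

sample = """
{ "lpars": [ { "name": "lpar-name",
               "bootstrapServers": "localhost:9092",
               "log": { "historyTopicName": "IBM-CDP-zOS-SYSLOG-Console-Historical" } } ] }
"""
print(check_policy(sample))  # [] when the file is well formed
```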
- Optional: Copy and customize the application.properties
file if you have the following requirements:
- To enable Transport Layer Security (TLS) communications with the Apache Kafka server, update the TLS related parameters that are described in Updating the configuration file of the Data Collector.
- Specify other parameters as needed. For detailed instructions, see The application.properties file.
The sample application.properties file for batch mode is located in $INSTDIR/DC/samples/batch, where $INSTDIR is the directory where the Data Collector is installed. If you need to configure the application.properties file, copy it and place it in the same directory as the <policy>.collection-config.json file.
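For illustration only, a TLS-enabled application.properties might resemble the fragment below. The property names shown follow standard Apache Kafka client conventions and are assumptions; the authoritative parameter names for the Data Collector are listed in Updating the configuration file of the Data Collector, and the paths and passwords here are placeholders.

```properties
# Assumed TLS settings, following common Kafka client naming.
# Verify the actual parameter names in Updating the configuration
# file of the Data Collector before use.
security.protocol=SSL
ssl.truststore.location=/u/kafka/ssl/kafka.truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/u/kafka/ssl/kafka.keystore.jks
ssl.keystore.password=changeit
```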
- Create and customize the job HBODCBAT.
- From data set hlq.SHBOSAMP, copy job HBODCBAT to your JCL library.
- In the copied HBODCBAT, set the following environment variables for your environment:
- JAVAHOME
- Specify the Java™ installation directory.
- CDPINST
- Use this parameter to specify the location where the Data Collector is installed. Replace /usr/lpp/IBM/zcdp/v5r1m0 with the directory where the Data Collector is installed.
- CDPWORK
- Use this parameter to specify the location of the policy file <policy>.collection-config.json, which is generated through the Configuration Tool or copied from the sample collection-config.json. If you also configured the application.properties file, place it in the same directory as the <policy>.collection-config.json file.
- POLICY
- Use this parameter to specify the prefix of the <policy>.collection-config.json file. Replace <policy> with the prefix you would like to use.
- RESTYPE
- Use this parameter to specify which data type to collect. The allowed values are SYSLOG and SMF. In this case, set the value to SYSLOG.
The parameters must be set correctly; otherwise, the Data Collector stops abnormally.

    // SET JAVAHOME='/usr/lpp/java/J8.0_64'
    // SET CDPINST='/usr/lpp/IBM/zcdp/v5r1m0'
    // SET CDPWORK='/u/kafka/test/samples'
    // SET POLICY='policy'
    // SET RESTYPE='SYSLOG'
- Comment out the SMFIN statement:

    //* SMFIN DD DISP=SHR,DSN=<SMF Data Set Name>
- Customize the HBOIN statement:
- Replace <Plex Name> with the name of the sysplex where the SYSLOG data sets come from.
- Replace <DD Name> with the DDNAME that you want to use in this JCL. You can collect logs from multiple sysplexes at one time, for example:

    PLEXNAME:TST1,DDNAME:DD1
    PLEXNAME:TST2,DDNAME:DD2
- Define the actual DD statements. Replace <Syslog Data Set Name> with the name of the data set that you want to collect logs from in the batch processor. You can collect logs from multiple sources for one DDNAME, for example:

    //DD1  DD DISP=SHR,DSN=ZCDP.SYSLOG1
    //     DD DISP=SHR,DSN=ZCDP.SYSLOG2
    //DD2  DD DISP=SHR,DSN=ZCDP.SYSLOG3

Note: Each DDNAME specified in step 3.d must be defined as an actual DD statement.
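Taken together, the customized statements from the steps above form a fragment like the following sketch. The data set names, sysplex names, and paths are the example values used earlier; the exact position of these statements within the job, and the in-stream form of the HBOIN input shown here, are assumptions that should be verified against the HBODCBAT member you copied from hlq.SHBOSAMP.

```jcl
// SET JAVAHOME='/usr/lpp/java/J8.0_64'
// SET CDPINST='/usr/lpp/IBM/zcdp/v5r1m0'
// SET CDPWORK='/u/kafka/test/samples'
// SET POLICY='policy'
// SET RESTYPE='SYSLOG'
//* SMFIN DD DISP=SHR,DSN=<SMF Data Set Name>
//HBOIN  DD *
PLEXNAME:TST1,DDNAME:DD1
PLEXNAME:TST2,DDNAME:DD2
/*
//DD1    DD DISP=SHR,DSN=ZCDP.SYSLOG1
//       DD DISP=SHR,DSN=ZCDP.SYSLOG2
//DD2    DD DISP=SHR,DSN=ZCDP.SYSLOG3
```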