To run the Z Common Data Provider System Data Engine in batch mode so that it streams its output to the Data Streamer, rather than writing it to a file, you must create the job for loading SMF data in batch. You can create this job by copying the sample job HBOJBCO2 from the hlq.SHBOSAMP library and updating the copy.
Procedure
To create the job, complete the following steps:
- Copy the job HBOJBCO2 in the hlq.SHBOSAMP library to a user job library.
- Update the job card according to your site standards.
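For example, a minimal job card might look like the following; the job name, accounting information, and classes shown here are hypothetical and must be replaced according to your site standards:
//* Example only: job name, account, and classes are hypothetical
//HBOSMFB JOB (ACCT),'LOAD SMF DATA',CLASS=A,MSGCLASS=H,
//             NOTIFY=&SYSUID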
- If affinity with a specific TCP/IP stack is needed, add the name of the stack to the end of the HBOTCAFF job step, as in the following example:
//HBOTCAFF EXEC PGM=BPXTCAFF,PARM=TPNAME
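For example, if the stack is named TCPIP (a hypothetical stack name), the updated step would read:
//* TCPIP here is a hypothetical stack name
//HBOTCAFF EXEC PGM=BPXTCAFF,PARM=TCPIP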
- To enable the zIIP offloading function to run eligible code on zIIPs, specify ZIIPOFFLOAD=YES in the PARM parameter of the EXEC statement.
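For example, assuming the job step that runs the System Data Engine program is named HBOSMF (the step name is hypothetical), the EXEC statement might look like the following:
//* Step name HBOSMF is hypothetical
//HBOSMF  EXEC PGM=HBOPDE,PARM='ZIIPOFFLOAD=YES'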
- Update the following STEPLIB DD statement to refer to the hlq.SHBOLOAD data set:
//STEPLIB DD DISP=SHR,DSN=HBOvrm.LOAD
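For example, if your high-level qualifier is CDP510 (a hypothetical value), the updated statement would be:
//* CDP510 is a hypothetical high-level qualifier
//STEPLIB DD DISP=SHR,DSN=CDP510.SHBOLOAD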
- Update the port value for IBM_UPDATE_TARGET to specify the TCP/IP port that is configured for the Data Streamer.
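For example, if the Data Streamer is configured to listen on port 51401 (a hypothetical port number; specify the port that is configured at your site), the statement in the HBOIN DD input would be:
SET IBM_UPDATE_TARGET = 'PORT 51401';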
- Optional: If the Data Streamer is configured to bind to a specific IP address, set the IP address of the Data Streamer by specifying the following SET statement in the HBOIN DD statement. Because the Data Streamer and the System Data Engine must run on the same LPAR, the IP address must be a valid IP address on the LPAR where the System Data Engine runs.
SET IBM_DS_HOST=ip_address
The following example assumes that the Data Streamer is configured to bind to the IP address 9.30.243.157.
//HBOIN DD *
SET IBM_DS_HOST = '9.30.243.157';
SET IBM_UPDATE_TARGET = 'PORT ppppp';
SET IBM_FILE_FORMAT = 'CSV';
// DD PATH='/etc/cdpConfig/hboin.sde',
// PATHDISP=(KEEP),RECFM=V,LRECL=255,FILEDATA=RECORD
- Replace the value /etc/cdpConfig/hboin.sde with the path and file name of the policy file that you create in the Configuration Tool.
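For example, if the policy file is /u/cdpadm/policies/prod.sde (a hypothetical path and file name), the concatenated DD statement would read:
// DD PATH='/u/cdpadm/policies/prod.sde',
// PATHDISP=(KEEP),RECFM=V,LRECL=255,FILEDATA=RECORD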
- Update the HBOLOG DD statement to specify the log data sets that contain the SMF data to load.
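For example, to read SMF data from two dump data sets (the data set names are hypothetical), concatenate them on the HBOLOG DD statement:
//* The data set names below are hypothetical
//HBOLOG  DD DISP=SHR,DSN=SMF.DUMP.DAY1
//        DD DISP=SHR,DSN=SMF.DUMP.DAY2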