Streaming key performance indicators for CICS Transaction Server for z/OS monitoring
Z Common Data Provider uses the SMF_110_1_KPI data stream to collect key performance indicators for CICS® Transaction Server for z/OS® monitoring. In addition to the fields in SMF_110_1_KPI, you can use DEFINE TEMPLATE to stream more data fields.
Before you begin
For more information about the content of the SMF_110_1_KPI data stream, see SMF_110_1_KPI data stream content. For more information about the fields in SMF_110_1_KPI, see Table 1.
About this task
You can create a DEFINE TEMPLATE statement to filter more fields of SMF_110_1_KPI records and customize the data stream to stream those fields. Then, create or update the policy in the Configuration Tool to include the data stream.
Procedure
- If you do not already have one, create a partitioned data set (PDS) to use as the user concatenation library for the custom template definition. For more information about how to create the partitioned data set, see step 1.a in Creating a System Data Engine data stream definition.
- Copy the sample update and template definitions from the member HBOUUKPI of the SMP/E target data set hlq.SHBODEFS to the user concatenation library, and edit the definitions based on your requirements.
The following example shows how to define an update and a template definition for filtering more fields of SMF_110_1_KPI records based on the sample template definition.

SET IBM_FILE = 'SMF110xx';
DEFINE UPDATE SMF_110_1_CUST
  VERSION 'CDP.510'
  FROM SMF_CICS_T
  TO &IBM_UPDATE_TARGET
  AS &IBM_FILE_FORMAT SET(ALL);
DEFINE TEMPLATE SMF_110_1_CUST FOR SMF_110_1_CUST
  ORDER (SMFMNTME, SMFMNDTE, fld1, fld2, ...... fldn)
  AS &IBM_FILE_FORMAT;
SET
- The SET statement is needed only when the target of the data stream is a file, that is, when the variable IBM_UPDATE_TARGET is set to FILE &IBM_FILE.
DEFINE UPDATE
- The custom update definition name must be unique among update definitions. For the language reference of the DEFINE UPDATE statement, see DEFINE UPDATE statement. You can change the value of CUST in SMF_110_1_CUST according to your needs.
DEFINE TEMPLATE
- For filtering more fields of SMF_110_1_KPI records, add a DEFINE TEMPLATE statement for the update definition in the same data set member as that update definition. The template definition name must be the same as the update definition name so that it replaces the default template definition, which streams all fields for the update definition.
For versions before the 4Q2019 PTF, the template definition must include the date (SMFMNDTE) and time (SMFMNTME) fields from the SMF record header of SMF_CICS_T. These fields are required for timestamp resolution when you ingest data to your analytics platform.
fld1, fld2, fldn
- This section defines the fields in the SMF_110_1_KPI record. The fields are separated by commas. You can select the fields that are listed in Fields for SMF_110_1_CUST data stream. For the language reference of the DEFINE TEMPLATE statement, see DEFINE TEMPLATE statement.
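For example, the following sketch shows a custom template that keeps the date and time fields and streams only two additional fields. The field names TRAN and USRCPUT are illustrative placeholders rather than confirmed entries in the SMF_110_1_KPI field list; replace them with field names from Fields for SMF_110_1_CUST data stream.

DEFINE TEMPLATE SMF_110_1_CUST FOR SMF_110_1_CUST
  ORDER (SMFMNTME, SMFMNDTE, TRAN, USRCPUT)
  AS &IBM_FILE_FORMAT;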
- Validate the syntax of the custom update and template definitions. Use the following example job to verify the member of the custom definitions.

//HBOJBCOL JOB (),'DUMMY',MSGCLASS=X,MSGLEVEL=(,0),
// CLASS=A,NOTIFY=&SYSUID
//*
//HBOSMFCB EXEC PGM=HBOPDE,REGION=0M,PARM='SHOWINPUT=YES'
//STEPLIB DD DISP=SHR,DSN=hlq.SHBOLOAD
//HBOOUT DD SYSOUT=*
//HBODUMP DD SYSOUT=*
//HBOIN DD DISP=SHR,DSN=hlq.SHBODEFS(HBOCCSV)
// DD DISP=SHR,DSN=hlq.SHBODEFS(HBOCCORY)
// DD DISP=SHR,DSN=hlq.SHBODEFS(HBOLLSMF)
// DD DISP=SHR,DSN=hlq.SHBODEFS(HBOTCIFI)
// DD DISP=SHR,DSN=hlq.SHBODEFS(HBORS110)
// DD DISP=SHR,DSN=USERID.LOCAL.DEFS(HBOUUKPI)
// DD *
COLLECT SMF WITH STATISTICS BUFFER SIZE 1 M;
//*
//HBOLOG DD DUMMY
hlq
- Change the hlq to the high-level qualifier for the Z Common Data Provider SMP/E target data set.
// DD DISP=SHR,DSN=USERID.LOCAL.DEFS(HBOUUKPI)
- Specifies the data set member for the custom definitions. USERID.LOCAL.DEFS is the user concatenation library. HBOUUKPI is the member that contains the update and template definitions. Replace the values based on your configuration.
Important: Ensure that the definitions are error-free by running the validation job before you create the custom data stream.
Messages are in the output file that is defined by HBOOUT. If there is no syntax error, you see the following messages.

HBO0201I Update SMF_110_1_CUST was successfully defined.
HBO0500I Template SMF_110_1_CUST was successfully defined.

If there are syntax errors, correct the errors according to the messages in the output file.
- Validate the data collection with the custom update and template definitions.
Collect data from an SMF data set that contains SMF type 110 subtype 1 records by using a batch System Data Engine job, and validate the data by reviewing the output data set.
Use the following example job to verify the data that is collected with the custom definitions.

//HBOJBCOL JOB (),'DUMMY',MSGCLASS=X,MSGLEVEL=(,0),
// CLASS=A,NOTIFY=&SYSUID
//*
//HBOSMFCB EXEC PGM=HBOPDE,REGION=0M,PARM='ALLHDRS=YES'
//STEPLIB DD DISP=SHR,DSN=hlq.SHBOLOAD
//HBOOUT DD SYSOUT=*
//HBODUMP DD SYSOUT=*
//HBOIN DD DISP=SHR,DSN=hlq.SHBODEFS(HBOCCSV)
// DD DISP=SHR,DSN=hlq.SHBODEFS(HBOCCORY)
// DD DISP=SHR,DSN=hlq.SHBODEFS(HBOLLSMF)
// DD DISP=SHR,DSN=hlq.SHBODEFS(HBOTCIFI)
// DD DISP=SHR,DSN=hlq.SHBODEFS(HBORS110)
// DD DISP=SHR,DSN=USERID.LOCAL.DEFS(HBOUUKPI)
// DD *
COLLECT SMF WITH STATISTICS BUFFER SIZE 1 M;
/*
//HBOLOG DD DISP=SHR,DSN=HLQ.LOCAL.SMFLOGS
//*
//SMF110xx DD DSN=USERID.SMF110xx.CSV,
// DISP=(NEW,CATLG,DELETE),SPACE=(CYL,(10,10)),
// DCB=(RECFM=V,LRECL=32756)

hlq
- Change hlq to the high-level qualifier for the Z Common Data Provider SMP/E target data set.
// DD DISP=SHR,DSN=USERID.LOCAL.DEFS(HBOUUKPI)
- Specifies the data set member for the custom definitions. USERID.LOCAL.DEFS is the user concatenation library. HBOUUKPI is the member that contains the update and template definitions. Replace the values based on your configuration. Ensure that the record definition member is included before the update definition member.
//HBOLOG DD DSN=
- Specifies the SMF data set that contains your SMF records.
//SMF110xx DD DSN=
- Specifies the data set that stores the output data. Ensure that this value is the same as the value of the SET IBM_FILE= statement in the corresponding update definition. The output data set is a CSV file that you can download and open with spreadsheet applications for validation.
- Create a custom System Data Engine data stream named SMF_110_1_CUST in the Configuration Tool.
For more information about how to create the custom System Data Engine data stream, see Creating a System Data Engine data stream definition. Verify that the data stream name, the custom update definition name, and the custom template definition name are the same.
Fill in the SHBODEFS data set members field as follows:

HBOLLSMF
HBORS110
HBOTCIFI
HBOUUKPI
- Update your analytics platform so that it can process the new data stream.
- If you are ingesting SMF_110_1_CUST data to the Elastic Stack, for each data stream, create a field name annotation configuration file and a timestamp resolution configuration file in the Logstash configuration directory.
Field name annotation configuration file
- The file is named H_SMF_110_1_CUST.conf. Here is an example of the file:

# CDPz ELK Ingestion
#
# Field Annotation for stream zOS-SMF_110_1_CUST
#
filter {
  if [sourceType] == "zOS-SMF_110_1_CUST" {
    csv {
      columns => [ "Correlator", "SMFMNTME", "SMFMNDTE", "fld1", "fld2", "fldn" ]
      separator => ","
    }
  }
}

Make sure that the value of CUST in SMF_110_1_CUST is the same as the value that is specified for the update definition name.
sourceType
- The value of sourceType must match the data source type of the data stream. The naming convention is zOS-SMF_110_1_CUST.
  if [sourceType] == "zOS-SMF_110_1_CUST"
fld1, fld2, and fldn
- Replace fld1, fld2, and fldn with the fields and order in your custom template definition. Keep Correlator as the first column in the list.
Timestamp resolution configuration file
- The file is named N_SMF_110_1_CUST.conf. Here is an example of the file:

# CDPz ELK Ingestion
#
# Timestamp Extraction for stream zOS-SMF_110_1_CUST
#
filter {
  if [sourceType] == "zOS-SMF_110_1_CUST" {
    mutate {
      add_field => { "[@metadata][timestamp]" => "%{SMFMNDTE} %{SMFMNTME}" }
    }
    date {
      match => [ "[@metadata][timestamp]", "yyyy-MM-dd HH:mm:ss:SS" ]
    }
  }
}

Make sure that the value of CUST in SMF_110_1_CUST is the same as the value that is specified for the update definition name.
sourceType
- The value of sourceType must match the data source type of the data stream. The naming convention is zOS-SMF_110_1_CUST.
  if [sourceType] == "zOS-SMF_110_1_CUST"
- If you are ingesting SMF_110_1_CUST data to Splunk, define the layout of the data stream to the Splunk server by creating the props.conf file in the Splunk_Home/etc/apps/ibm_zlda_insights/local directory on the Splunk server. If the props.conf file exists, append the following content to the file.

#
# SMF_110_1_CUST
#
[zOS-SMF_110_1_CUST]
TIMESTAMP_FIELDS = SMFMNDTE, SMFMNTME, timezone
TIME_FORMAT = %F %H:%M:%S:%2Q %z
FIELD_NAMES = "sysplex","system","hostname","","","sourcename","timezone","Correlator","SMFMNTME","SMFMNDTE","fld1","fld2","fldn"
INDEXED_EXTRACTIONS = csv
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
disabled = false
pulldown_type = true

Make sure that the value of CUST in SMF_110_1_CUST is the same as the value that is specified for the update definition name.
[zOS-SMF_110_1_CUST]
- You must specify the data source name of the data stream. The naming convention is zOS-SMF_110_1_CUST.
FIELD_NAMES
- Replace fld1, fld2, and fldn with the fields and order in your custom template definition. If the column Correlator exists, do not remove it. The SMF_110_1_CUST data stream has the file that is named CDP-zOS-SMF_110_1_CUST-*.cdp.

Restart the Splunk server after you make the changes. Refer to the Splunk documentation for more information.
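For example, on many installations you can restart Splunk from the command line; the installation path shown here is an assumption and might differ in your environment.

# Restart Splunk so that the props.conf changes take effect
$SPLUNK_HOME/bin/splunk restart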
- Create or update the policy to add the new System Data Engine data stream SMF_110_1_CUST.
- In the Configuration Tool primary window, create a new policy or select the policy that you want to update.
- Click the Add Data Stream icon in the Policy Profile Edit window.
- Find and select the new data stream from the list in the select data stream window.
- Assign a subscriber for each new data stream.
- In the Policy Profile Edit window, click SYSTEM DATA ENGINE to ensure that values are provided for the USER Concatenation and CDP Concatenation fields, and click OK. Fill in the USER Concatenation field with the data set name of your user concatenation library.
- Click Save to save the policy.
Important: Each time that the associated update definition or template definition is changed, you must edit and save the policy in the Configuration Tool so that the changes are reflected in the policy. For more information about how to update a policy, see Updating a policy.
- Restart the Data Streamer and the System Data Engine.