Static deployment with the Ansible translator
In this scenario, you deploy a package file that contains all the application artifacts to be deployed. You use the Wazi Deploy Ansible® translator for the deployment. The whole process takes place where the Ansible controller is installed.
Requirements
Make sure that all the installation requirements are met.
Deploy an application package
This scenario applies to a CICS® Db2® application that contains CICS Db2 load modules and database request modules (DBRMs) to deploy. The deployment steps are implemented by the Wazi Deploy building blocks. For the complete list of these building blocks, see The building blocks for the Ansible and Python translators.
- Create a package.v1.tar package file that contains CICS Db2 load modules and DBRMs. For more information, see The package file.

  To be correctly processed by Wazi Deploy, all the artifacts in the .tar file must have an extension.

  To create this .tar file, you can run The Wazi Deploy packager command (with the uploadType argument set to archive) or build the file with your enterprise build framework. The package typically contains only what was built during the build process.

  Note: You can build the .tar file from an IBM® Dependency Based Build (DBB) build with the wazideploy-package command. To do so, complete the following steps:

  - Create a dbb_prepare_local_folder.py Python script with the following code:
    ```python
    #*******************************************************************************
    # Licensed Materials - Property of IBM
    # (c) Copyright IBM Corp. 2025. All Rights Reserved.
    #
    # Note to U.S. Government Users Restricted Rights:
    # Use, duplication or disclosure restricted by GSA ADP Schedule
    # Contract with IBM Corp.
    #*******************************************************************************
    import argparse
    import sys
    import os
    import platform
    import re
    import json
    import subprocess
    import yaml  # required by get_copy_mode when --copyModeProperties is used
    from tempfile import NamedTemporaryFile

    class DBBUtilities(object):

        @staticmethod
        def filter_deployable_records(record) -> bool:
            try:
                if (record['type'] == 'EXECUTE' or record['type'] == 'COPY_TO_PDS') and len(record['outputs']) > 0:
                    for output in record['outputs']:
                        try:
                            if output['deployType']:
                                return True
                        except:
                            pass
            except:
                pass
            return False

        @staticmethod
        def filter_deleted_records(record) -> bool:
            try:
                if record.get('deletedBuildOutputs'):
                    return True
            except:
                pass
            return False

        @staticmethod
        def read_build_result(read_build_result_file) -> dict:
            with open(read_build_result_file) as read_file:
                return dict(json.load(read_file))

        @staticmethod
        def get_copy_mode(deployType: str = "LOAD", **kwargs) -> str:
            if kwargs.get('copyModeProperties') is not None:
                props = {}
                props_yaml_file = kwargs['copyModeProperties']
                try:
                    with open(props_yaml_file, 'r') as stream:
                        props = dict(yaml.safe_load(stream))
                    if props.get(deployType) is not None:
                        return props.get(deployType)
                except IOError as error:
                    print(error, file=sys.stderr)
                    raise RuntimeError(f"!!! Couldn't open target environment file from: {props_yaml_file} !!!")
            if re.search('LOAD', deployType, re.IGNORECASE):
                return "LOAD"
            elif re.search('DBRM', deployType, re.IGNORECASE):
                return "BINARY"
            elif re.search('PSB', deployType, re.IGNORECASE):
                return "LOAD"
            elif re.search('DBD', deployType, re.IGNORECASE):
                return "LOAD"
            elif re.search('TEXT', deployType, re.IGNORECASE):
                return "TEXT"
            elif re.search('COPY', deployType, re.IGNORECASE):
                return "TEXT"
            elif re.search('OBJ', deployType, re.IGNORECASE):
                return "BINARY"
            elif re.search('DDL', deployType, re.IGNORECASE):
                return "TEXT"
            elif re.search('JCL', deployType, re.IGNORECASE):
                return "TEXT"
            elif re.search('CEXEC', deployType, re.IGNORECASE):
                return "BINARY"
            elif re.search('LANGX', deployType, re.IGNORECASE):
                return "LANGX"
            else:
                return "TEXT"

    def run_command(args, verbose: bool = True, shell: bool = False):
        process = subprocess.run(args, capture_output=True, text=True, shell=shell)
        if process.returncode != 0 and verbose:
            print("stdout:", process.stdout, file=sys.stdout)
            print("stderr:", process.stderr, file=sys.stderr)
        return process.returncode, process.stdout, process.stderr

    def copy_dbb_build_result_to_local_folder(**kwargs):
        dbb_build_result_file = kwargs['dbbBuildResult']
        working_folder = kwargs['workingFolder']

        buildResult = DBBUtilities.read_build_result(dbb_build_result_file)
        records = list(filter(lambda record: DBBUtilities().filter_deployable_records(record), buildResult['records']))
        for record in records:
            for output in record['outputs']:
                if 'deployType' in output:
                    dataset = output['dataset']
                    deploy_type = output['deployType']
                    parts = re.split('\\(|\\)', dataset)
                    if len(parts) < 2:
                        print(f"**! WARNING: {parts[0]} has no members!!!")
                        continue
                    member_name = parts[1]
                    pds_name = parts[0]

                    # Build the local folder from the DBB build outputs
                    os.makedirs(f"{working_folder}/{pds_name}", exist_ok=True)
                    copyMode = DBBUtilities.get_copy_mode(deploy_type, **kwargs)
                    msgstr = f"** Copy //'{dataset}' to {working_folder}/{pds_name}/{member_name}.{deploy_type} ({copyMode})"
                    print(msgstr)
                    if copyMode == 'LANGX':
                        f = NamedTemporaryFile(delete=False, encoding='cp1047', mode='w+t', suffix='.rexx')
                        f.write(f"""/* rexx */
    ADDRESS TSO
    "ALLOC FI(SYSPRINT) DUMMY REUSE"
    "ALLOC FI(SYSPRINT) DUMMY REUSE"
    "ALLOC FI(SYSIN) DUMMY REUSE"
    "ALLOC FI(SYSUT1) DATASET('{dataset}') SHR REUSE"
    "ALLOC FI(SYSUT2) PATH('{f.name}.{deploy_type}') PATHOPTS(ORDWR,OCREAT) PATHDISP(KEEP,DELETE) PATHMODE(SIRUSR,SIWUSR,SIRGRP,SIROTH) FILEDATA(RECORD) RECFM(V B) LRECL(1562) BLKSIZE(32760)"
    "IEBGENER"
    RC2 = RC
    "FREE FI(SYSUT1)"
    "FREE FI(SYSUT2)"
    "FREE FI(SYSPRINT)"
    "FREE FI(SYSIN)"
    IF RC2 ^= 0 THEN DO
      SAY "IEBGENER FAILED."
    END
    RETURN RC2
    """)
                        f.flush()
                        os.chmod(f.name, 0o755)
                        if platform.system() == 'OS/390':
                            args = f"sh -c {f.name}"
                            rc, out, err = run_command(args, shell=True)
                            if rc != 0:
                                msgstr = f"*! Error executing command: {args} out: {out} error: {err}"
                                print(msgstr)
                                sys.exit(-1)
                        f.close()
                        args = ["mv", f"{f.name}.{deploy_type}", f"{working_folder}/{pds_name}/{member_name}.{deploy_type}"]
                    elif copyMode == 'LOAD':
                        args = ["cp", "-XI", f"//'{dataset}'", f"{working_folder}/{pds_name}/{member_name}.{deploy_type}"]
                    elif copyMode == 'BINARY':
                        args = ["cp", "-F", "bin", f"//'{dataset}'", f"{working_folder}/{pds_name}/{member_name}.{deploy_type}"]
                    else:
                        args = ["cp", f"//'{dataset}'", f"{working_folder}/{pds_name}/{member_name}.{deploy_type}"]
                    if platform.system() == 'OS/390':
                        rc, out, err = run_command(args)
                        if rc != 0:
                            msgstr = f"*! Error executing command: {args} out: {out} error: {err}"
                            print(msgstr)
                            sys.exit(-1)
                        # Tag the copied file so that it is correctly converted (or not) in USS
                        if copyMode == 'TEXT' or copyMode == 'LANGX':
                            args = ["chtag", "-r", f"{working_folder}/{pds_name}/{member_name}.{deploy_type}"]
                        else:
                            args = ["chtag", "-b", f"{working_folder}/{pds_name}/{member_name}.{deploy_type}"]
                        rc, out, err = run_command(args)
                        if rc != 0:
                            msgstr = f"*! Error executing command: {args} out: {out} error: {err}"
                            print(msgstr)
                            sys.exit(-1)

    def main():
        parser = argparse.ArgumentParser(description="DBB Prepare Package")
        parser.add_argument('-br', '--dbbBuildResult', required=True, help='The DBB build result file')
        parser.add_argument('-wf', '--workingFolder', required=True, help='The path to the working folder')
        parser.add_argument('-cp', '--copyModeProperties', help='The path to the file that contains copy mode properties')
        if len(sys.argv[1:]) == 0:
            parser.print_help()
            return
        args = parser.parse_args()
        kwargs = vars(args)
        copy_dbb_build_result_to_local_folder(**kwargs)

    if __name__ == '__main__':
        main()
    ```
  - Run the following command to prepare the z/OS® UNIX System Services local folder:

    ```
    python3 dbb_prepare_local_folder.py --dbbBuildResult BuildReport.json --workingFolder ./package
    ```
  - Run the wazideploy-package command with the following code:

    ```
    wazideploy-package \
      --manifestName genapp \
      --manifestVersion 1.0.0 \
      --manifest ./package/wazideploy_manifest.yml \
      --buildUrl http://example.com \
      --buildName genapp_build \
      --localFolder ./package \
      --configFile config.yml \
      --repository wazi_deploy_repo \
      --repositoryPath genapp/dev \
      --uploadType archive
    ```
  To create a package .tar file, you can also use IBM Dependency Based Build (DBB) with zAppBuild as the build manager. In this case, you run the PackageBuildOutputs Groovy script and provide the --addExtension (-ae) argument.
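For reference, the dataset-to-folder mapping at the heart of dbb_prepare_local_folder.py can be shown in isolation. This sketch reuses the same `re.split` call as the script to split a DBB output dataset name into a PDS name and a member name:

```python
import re

def split_dataset(dataset: str):
    """Split a DBB output dataset name such as 'HLQ.LOAD(LGACDB02)'
    into (pds_name, member_name), as dbb_prepare_local_folder.py does."""
    parts = re.split(r"\(|\)", dataset)
    if len(parts) < 2:
        # The script prints a warning and skips datasets without members.
        return parts[0], None
    return parts[0], parts[1]

print(split_dataset("NAZARE.WDEPLOY.GENAPP.LOAD(LGACDB02)"))
# → ('NAZARE.WDEPLOY.GENAPP.LOAD', 'LGACDB02')
```

The script then copies the member to `<workingFolder>/<pds_name>/<member_name>.<deployType>`, which is the layout that the packager expects.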
) argument. - Create a dbb_prepare_local_folder.py Python script with the following
code:
- Create a static-deployment-method.yml file, which constitutes the deployment method. This deployment method is simple, as it contains only the following four activities:
  - The PACKAGE activity, which handles the application package.
  - The DEPLOY_MODULES activity, which copies the artifacts from the package to the target PDS in your z/OS environment.
  - The DB2 activity, for the BIND PLAN and BIND PACKAGE steps of the deployment.
  - The CICS activity, for the CMCI NEWCOPY step of the deployment.
Specify the deployment method contents by copying and pasting the following lines:
  ```yaml
  ---
  apiVersion: deploy.ibm.com/v1
  kind: DeploymentMethod
  metadata:
    name: "CICS"
    version: "1.0.0"
    description: |
      This deployment method can be used for the deployment of a CICS application.
  activities:
    - name: PACKAGE
      description: |
        This activity handles the package
      actions:
        - name: PACKAGE
          steps:
            - name: PACKAGE
              tags:
                - always
    - name: DEPLOY_MODULES
      description: |
        This activity is dedicated to the deployment of the artifacts into the PDS
      actions:
        - name: UPDATE
          description: |
            This action is applicable when artifacts are updated
          states:
            - UNDEFINED
          steps:
            - name: MEMBER_COPY
              description: |
                This step copies artifacts into PDSs
          types:
            - name: 'DBRM'
            - name: 'LOAD'
            - name: 'CICSLOAD'
            - name: 'MAPLOAD'
            - name: 'JCL'
          is_artifact: True
    - name: DB2
      description: |
        This activity is dedicated to DBRM bind modules
      actions:
        - name: UPDATE
          description: |
            This action is applicable when DBRM modules are updated
          states:
            - UNDEFINED
          steps:
            - name: DB2_BIND_PACKAGE
            - name: DB2_BIND_PLAN
          types:
            - name: 'DBRM'
          is_artifact: True
    - name: CICS
      description: |
        This activity is dedicated to CICS load modules
      actions:
        - name: UPDATE
          states:
            - UNDEFINED
          steps:
            - name: PROG_UPDATE
              properties:
                - key: "template"
                  value: "cics_cmci_prog_update"
          types:
            - name: 'CICSLOAD'
            - name: 'MAPLOAD'
          is_artifact: True
  ```
- Generate the deployment plan by running the wazideploy-generate command with the deployment method and the archive file as input. Enter the following command:

  ```
  wazideploy-generate -dm static-deployment-method.yml -dp deployment_plan.yml -pif package.v1.tar -dpn "CICS" -dpv "1.0.0" -dpd "CICS App"
  ```
  At this stage, the deployment_plan.yml deployment plan and the deployment plan report (DeploymentPlanReport.html by default) are generated.

- Use the Wazi Deploy Ansible translator to deploy to your target z/OS environment. This deployment involves the following steps:
  - Create an Ansible inventory by following the Ansible official documentation at How to build your inventory.

    For example, you can create an inventories/inventory.yml folder/file structure and specify lines similar to the following sample lines in inventories/inventory.yml:

    ```yaml
    all:
      hosts:
        zos_env_test:
          ansible_host: zos.dev
          ansible_user: IBMUSER
          ansible_ssh_port: 22
    ```
  - Provide an Ansible environment file to describe the target environment where the artifacts are to be deployed.

    You can use the following sample to create your all.yml environment file in the inventories/group_vars folder of your Ansible inventory. Adapt the values to your environment:

    ```yaml
    ################################################################################
    # Environment variables for all z/OS managed nodes (target) that do not need
    # further configuration.
    ################################################################################
    environment_vars:
      _BPXK_AUTOCVT: "ON"
      ZOAU_HOME: "{{ ZOAU }}"
      PYTHONSTDINENCODING: "cp1047"
      LIBPATH: "{{ ZOAU }}/lib:{{ PYZ }}/lib:/lib:/usr/lib:/lib"
      PATH: "{{ ZOAU }}/bin:{{ PYZ }}/bin:/bin:/var/bin"
      _CEE_RUNOPTS: "FILETAG(AUTOCVT,AUTOTAG) POSIX(ON)"
      _TAG_REDIR_ERR: "txt"
      _TAG_REDIR_IN: "txt"
      _TAG_REDIR_OUT: "txt"
      LANG: "C"

    ################################################################################
    # Description of the properties used in this configuration:
    # - Property `PYZ` is the Python installation home path on the z/OS managed
    #   node (target), e.g., pyz: "/usr/lpp/IBM/cyp/v3r11/pyz"
    # - Property `ZOAU` is the ZOAU installation home on the z/OS managed node
    #   (target), e.g., zoau: "/usr/lpp/IBM/zoautil"
    # - Property `ansible_python_interpreter` is the z/OS managed node (target)
    #   Python binary installation path, e.g.,
    #   ansible_python_interpreter: "{{ PYZ }}/bin/python3.8"
    #
    # Note: the PYZ and ZOAU environment variables must be configured.
    ################################################################################
    PYZ: "/usr/lpp/IBM/cyp/v3r11/pyz"
    ZOAU: "/usr/lpp/IBM/zoautil"

    ################################################################################
    # Do not configure; variable substitution will correctly set the
    # variable `ansible_python_interpreter`.
    ################################################################################
    ansible_python_interpreter: "{{ PYZ }}/bin/python3"

    ################################################################################
    # Description of the properties used in this configuration:
    # - Property _BPXK_AUTOCVT must be configured to "ON"; e.g., _BPXK_AUTOCVT: "ON"
    # - Property ZOAU_HOME is the ZOA Utilities install root path; e.g.,
    #   ZOAU_HOME: "/usr/lpp/IBM/zoautil"
    # - Property PYTHONPATH is the ZOA Utilities Python library path; e.g.,
    #   PYTHONPATH: "/usr/lpp/IBM/zoautil/lib"
    # - Property LIBPATH is both the path to the Python libraries on the target
    #   and the ZOA Utilities Python library path, separated by colons; e.g.,
    #   LIBPATH: "/usr/lpp/IBM/zoautil/lib/:/usr/lpp/IBM/cyp/v3r11/pyz/lib:/usr/lib:/lib:."
    # - Property PATH is the ZOA Utilities BIN path and Python interpreter path; e.g.,
    #   PATH: "/usr/lpp/IBM/zoautil/bin:/usr/lpp/IBM/cyp/v3r11/pyz/bin:/bin"
    # - Property _CEE_RUNOPTS is the invocation Language Environment runtime
    #   options for programs, used by Python; e.g.,
    #   _CEE_RUNOPTS: "FILETAG(AUTOCVT,AUTOTAG) POSIX(ON)"
    # - Properties _TAG_REDIR_ERR, _TAG_REDIR_IN, _TAG_REDIR_OUT are txt and used
    #   by the shell; e.g.,
    #   _TAG_REDIR_ERR: "txt"
    #   _TAG_REDIR_IN: "txt"
    #   _TAG_REDIR_OUT: "txt"
    # - Property LANG is the name of the default locale; value C specifies the
    #   POSIX locale; for example: LANG: "C"
    ################################################################################

    ################################################################################
    # Deploy-specific variables
    ################################################################################
    common_pds_load_spec:
      type: LIBRARY
      space_primary: 10
      space_secondary: 20
      space_type: CYL
      record_format: U
      record_length: 0
    common_pds_binary_spec:
      type: LIBRARY
      space_primary: 10
      space_secondary: 20
      space_type: CYL
      record_format: FB
      record_length: 80

    hlq: 'NAZARE.WDEPLOY.ANSIBLE'
    uss_root_folder: '/tmp/uss_root'

    default_package_jobcard: "BINDPKG JOB 'WD-PKGBIND',MSGLEVEL=(1,1),MSGCLASS=R,NOTIFY=&SYSUID"
    default_plan_jobcard: "BINDPLA JOB 'WD-PLANBIND',MSGLEVEL=(1,1),MSGCLASS=R,NOTIFY=&SYSUID"
    default_db2_user: "{{ansible_user}}"
    default_db2_action: "REPLACE"
    db2_sys_def:
      sdsnload: DSN.V12R1M0.SDSNLOAD
      subsys: DBC1
      qualifier: GENADB0
      package: GENASA1
      plan: GENAONE
      sqlid: "{{default_db2_user}}"
      package_jobcard: "{{default_package_jobcard}}"
      plan_jobcard: "{{default_plan_jobcard}}"
      package_action: "{{default_db2_action}}"
      plan_action: "{{default_db2_action}}"
      package_job_max_rc: 4
      plan_job_max_rc: 4
      plan_pklist: '*.GENASA1.*'

    default_cics_cmci_action: 'PHASEIN'
    cics_sys_def:
      cmci_host: "{{ ansible_host }}"
      cmci_port: 1490
      # cmci_cert:
      # cmci_key:
      cmci_user: "{{ansible_user}}"
      cmci_password: "{{ansible_password if ansible_password is defined else None}}"
      context: CICS01
      scheme: http
      # scope:
      insecure: true
      csd_group: GENASAP

    # Types to manage
    default_types: 'types'
    types:
      - type: 'load'
        copy_by_folder: '{{default_copy_by_folder}}'
        pds:
          name: '{{ hlq }}.LOADLIB'
          spec: "{{common_pds_load_spec}}"
          backup: '{{ hlq }}.BACK.LOADLIB'
          is_load: True
          aliases: True
          force_lock: True
      - type: 'dbrm'
        copy_by_folder: '{{default_copy_by_folder}}'
        pds:
          name: '{{ hlq }}.DBRM'
          spec: "{{common_pds_binary_spec}}"
          backup: '{{ hlq }}.BACK.DBRM'
          is_binary: True
          force_lock: True
        db2_systems:
          - "{{ db2_sys_def }}"
      - type: 'cicsload'
        copy_by_folder: '{{default_copy_by_folder}}'
        pds:
          name: '{{ hlq }}.LOADLIB'
          spec: "{{common_pds_load_spec}}"
          backup: '{{ hlq }}.BACK.LOADLIB'
          is_load: True
          aliases: True
          force_lock: True
        cics_systems:
          - "{{ cics_sys_def }}"
      - type: 'mapload'
        copy_by_folder: '{{default_copy_by_folder}}'
        pds:
          name: '{{ hlq }}.LOADLIB'
          spec: "{{common_pds_load_spec}}"
          backup: '{{ hlq }}.BACK.LOADLIB'
          is_load: True
          aliases: True
          force_lock: True
        cics_systems:
          - "{{ cics_sys_def }}"

    # Default folders and logging values
    wd_env:
      uss_dir: "/tmp/wd_deploy/{{ansible_user|lower}}/{{inventory_hostname}}"
      local_dir: "/tmp/wd_deploy/{{lookup('env','USER')|lower}}/{{inventory_hostname}}"
    #wd_log: False
    ```
  - Create two Ansible files for the deployment.

    - An Ansible playbook (deploy.yml, for example) must call the Wazi Deploy role in the following way:

      ```yaml
      - hosts: "{{ wd_hosts if wd_hosts is defined else 'all' }}"
        gather_facts: "{{ wd_gather_facts if wd_gather_facts is defined else 'no' }}"
        serial: "{{ wd_serial if wd_serial is defined else 5 }}"
        tasks:
          - import_role:
              name: ibm.ibm_zos_wazi_deploy.zos_deploy
      ```

    - An ansible.cfg file must include the following contents:

      ```ini
      [defaults]
      forks = 25
      host_key_checking = False
      #callbacks_enabled = ibm.ibm_zos_wazi_deploy.cb_aggregate_evidences
      stdout_callback = ibm.ibm_zos_wazi_deploy.cb_evidences

      [ssh_connection]
      pipelining = True
      #ssh_args = -o ControlMaster=auto -o ControlPersist=3600s -o PreferredAuthentications=publickey
      ```
  - Run the following command in Ansible to deploy the artifacts to z/OS:

    ```
    ansible-playbook deploy.yml \
      -i inventories \
      -e wd_deployment_plan_file=deployment_plan.yml \
      -e wd_package_file=package.v1.tar
    ```
    The successful completion of the deployment produces the following results:
    - The artifacts are deployed to the PDS that was specified in the all.yml environment file.
    - The DB2 BIND and CICS activities are run.
    - An evidence file is generated in the evidences folder, with the naming convention <inventory_host>_evidences_<yyyyMMdd>_<HHmmss>.yml. For example, if the current Ansible inventory host is zos_env_test and the deployment completed on 10 May 2023 at 12:45:30, the complete path of the evidence file is ./evidences/zos_env_test_evidences_20230510_124530.yml.

    With this evidence file, you can analyze the Wazi Deploy deployment process and the content of the target deployment environment. For more information, see Getting started with the analysis of the deployment results.
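The evidence file path can be derived from the inventory host and the deployment timestamp. The helper below is hypothetical (not a Wazi Deploy API); it only illustrates the naming convention shown in the example:

```python
from datetime import datetime

def evidence_file_path(inventory_host: str, completed: datetime, folder: str = "./evidences") -> str:
    """Illustrate the evidence file naming convention:
    <folder>/<host>_evidences_<yyyyMMdd>_<HHmmss>.yml (hypothetical helper)."""
    return f"{folder}/{inventory_host}_evidences_{completed.strftime('%Y%m%d_%H%M%S')}.yml"

print(evidence_file_path("zos_env_test", datetime(2023, 5, 10, 12, 45, 30)))
# → ./evidences/zos_env_test_evidences_20230510_124530.yml
```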
Deploy a new version of the application package with a backup step
Now you can deploy an updated version of the application: the package.v2.tar package file, which contains a new version of the artifacts to update.
- Add a step to your static-deployment-method.yml deployment method to back up the already deployed artifacts in a PDS. To do so, add a MEMBER_ARCHIVE step before the MEMBER_COPY step in static-deployment-method.yml:

  ```yaml
  - name: MEMBER_ARCHIVE
    description: |
      This step backs up already installed artifacts into the backup PDS.
  ```
- Generate the new deployment plan by running the wazideploy-generate command:

  ```
  wazideploy-generate -dm static-deployment-method.yml -dp deployment_plan.yml -pif package.v2.tar -dpn "CICS" -dpv "2.0.0" -dpd "CICS App"
  ```
- Trigger the new deployment by running the following command in Ansible:

  ```
  ansible-playbook deploy.yml \
    -i inventories \
    -e wd_deployment_plan_file=deployment_plan.yml \
    -e wd_package_file=package.v2.tar
  ```
At this stage, a new version of the updated artifacts is deployed and a backup of the previous version is now available in the backup PDS.
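The backup members land in the backup PDS declared by the `backup` keys in the all.yml sample (for example, `'{{ hlq }}.BACK.LOADLIB'`). As an illustration only (Wazi Deploy resolves these names from the environment file; this helper is hypothetical), the sample's naming scheme resolves as:

```python
def backup_pds(hlq: str, library: str) -> str:
    """Resolve a backup PDS name the way the all.yml sample declares it:
    '<hlq>.BACK.<library>' (illustrative helper, not a Wazi Deploy API)."""
    return f"{hlq}.BACK.{library}"

print(backup_pds("NAZARE.WDEPLOY.ANSIBLE", "LOADLIB"))
# → NAZARE.WDEPLOY.ANSIBLE.BACK.LOADLIB
```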
Roll back the latest deployment
You can revert to the previous version of the application in the target z/OS environment by adding a MEMBER_RESTORE step and by using the plan_tags element to build the restore process flow.
- Add a step to your static-deployment-method.yml deployment method to restore the artifacts that are stored in the backup PDS. To do so, add a MEMBER_RESTORE step before the MEMBER_COPY step in static-deployment-method.yml with the following plan_tags:

  ```yaml
  - name: MEMBER_RESTORE
    description: |
      This step restores the artifacts in PDSs
    plan_tags:
      - restore
      - never
  ```
  When the plan_tags element is set to never, the step is never run at deployment time unless you force it to run.

- Add a restore plan_tags to the DB2® and CICS activities:

  ```yaml
  plan_tags:
    - restore
  ```
- Generate the new deployment plan by running the wazideploy-generate command:

  ```
  wazideploy-generate -dm static-deployment-method.yml -dp deployment_plan.yml -pif package.v2.tar -dpn "CICS" -dpv "1.0.0" -dpd "CICS App"
  ```
- Trigger the new deployment by running the following command in Ansible:

  ```
  ansible-playbook deploy.yml \
    -i inventories \
    -e wd_deployment_plan_file=deployment_plan.yml \
    -e wd_package_file=package.v2.tar \
    -e planTags=restore
  ```
At this stage, the artifacts are reverted to the version that was deployed before the latest deployment.
This basic rollback is intended to show the backup and restore capabilities of Wazi Deploy.
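The plan_tags behavior used in this rollback follows Ansible-like tag semantics: a step tagged never is skipped unless one of its other tags is explicitly requested, and a step tagged always runs on every deployment. The following sketch models that rule as described above; it is an illustration, not Wazi Deploy's implementation:

```python
def step_runs(plan_tags, requested_tags=()):
    """Model the plan_tags rule (illustrative only):
    - 'always' makes a step run on every deployment;
    - 'never' skips a step unless another of its tags is requested."""
    tags = set(plan_tags or [])
    requested = set(requested_tags)
    if "always" in tags:
        return True
    if "never" in tags:
        return bool((tags - {"never"}) & requested)
    # Untagged steps, or steps whose tags match the request, run normally.
    return not requested or bool(tags & requested)

restore_step = ["restore", "never"]
print(step_runs(restore_step))               # → False: skipped on a normal deployment
print(step_runs(restore_step, ["restore"]))  # → True: runs when planTags=restore is passed
```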
Delete an artifact from the application
To delete an artifact, provide an application manifest with artifacts to delete.
- Create a deleted_artifacts.yml manifest with the following content:

  ```yaml
  apiVersion: deploy.ibm.com/v1
  kind: ManifestState
  metadata:
    name: CICS
    description: "CICS App"
    version: 3.0.0
  deleted_artifacts:
    - name: LGACDB02
      description: LGACDB02.CICSLOAD
      properties:
        - key: path
          value: NAZARE.WDEPLOY.DBBBUILD.GENAPP.LOAD/LGACDB02.CICSLOAD
      type: CICSLOAD
  ```
- At the end of the static-deployment-method.yml deployment method, add the following activity:

  ```yaml
  - name: DELETE_MODULES
    description: |
      This activity is dedicated to the deletion of the artifacts from the PDS(E)s
    actions:
      - name: DELETE
        description: |
          This action is applicable when the artifacts are deleted
        states:
          - DELETED
        steps:
          - name: MEMBER_ARCHIVE
            description: |
              This step backs up already installed artifacts into the backup PDS
          - name: MEMBER_DELETE
            description: |
              This step deletes the artifacts
        types:
          - name: 'DBRM'
          - name: 'LOAD'
          - name: 'CICSLOAD'
          - name: 'MAPLOAD'
          - name: 'JCL'
        is_artifact: True
  ```
- Generate the new deployment plan by running the wazideploy-generate command:

  ```
  wazideploy-generate -dm static-deployment-method.yml -dp deployment_plan.yml -m deleted_artifacts.yml
  ```
- Trigger the new deployment by running the following command in Ansible:

  ```
  ansible-playbook deploy.yml \
    -i inventories \
    -e wd_deployment_plan_file=deployment_plan.yml
  ```
At this stage, the artifacts have been deleted.
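In the deleted_artifacts.yml manifest, each artifact's `path` property follows the `<pds>/<member>.<deployType>` pattern shown in the sample. The split can be sketched as follows (a hypothetical helper for illustration, not part of Wazi Deploy):

```python
def parse_artifact_path(path: str):
    """Split a deleted-artifact path, '<pds>/<member>.<deployType>',
    into its three parts (illustrative helper)."""
    pds, _, file_part = path.partition("/")
    member, _, deploy_type = file_part.rpartition(".")
    return pds, member, deploy_type

print(parse_artifact_path("NAZARE.WDEPLOY.DBBBUILD.GENAPP.LOAD/LGACDB02.CICSLOAD"))
# → ('NAZARE.WDEPLOY.DBBBUILD.GENAPP.LOAD', 'LGACDB02', 'CICSLOAD')
```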
Generate a report for the generated evidences
For information about how to query the generated evidences, see Getting started with the analysis of the deployment results.
Wazi Deploy comes with predefined queries and renderers that you can use to quickly generate reports in an HTML or a YAML format. For more information, see The predefined queries in the product.
- Index the generated evidences that were recorded during this scenario by running the following command:

  ```
  wazideploy-evidence -df evidences -if indexes i
  ```
- Generate an HTML report by running the following command:

  ```
  wazideploy-evidence -q wd-deployments-summary -if indexes -o wd-deployments-summary.html r renderer=wd-deployments-summary-renderer.html artifact_name="*"
  ```
At this stage, the wd-deployments-summary.html file is generated. For a sample output of this file, see wd-deployments-summary-query sample output.
Filter the deployment results
You can add different types of filters to determine what is to be deployed. For more information, see Getting started with conditional deployment.
Extend the Wazi Deploy capabilities by creating your own building blocks
If the standard steps, which are implemented by the standard Wazi Deploy Ansible or Python building blocks, do not suit your needs, you can create your own building blocks. See Creating your own building blocks for the Wazi Deploy translators.