Molecule is the standard framework for testing Ansible roles. IBM Cloud DevSecOps provides toolchains that align with the requirements of the financial services industry.

To get started, follow the IBM Cloud tutorial on deploying a secure application with DevSecOps best practices.

In this blog post, I discuss how to set up Molecule tests and execute them as part of both the pull request (PR) pipeline and the continuous integration (CI) pipeline.


Provision a bare metal server using IBM Cloud. If you already have existing infrastructure, you can use that instead; you just need to ensure that the machine is reachable from IBM Cloud. Set up a user with minimal privileges on this machine to run the tests. The credentials for this user will be used to drive the tests.
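One way to keep that user minimally privileged is to scope an OpenSSH server configuration block to it. A hypothetical fragment (the user name molecule-ci is an assumption, not from the pipeline):

```
# /etc/ssh/sshd_config.d/molecule-ci.conf -- illustrative only
Match User molecule-ci
    PasswordAuthentication yes
    AllowTcpForwarding no
    X11Forwarding no
```

This allows the password-based login the test script needs while disabling forwarding features the tests do not use.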

Since the toolchains come with an integrated Tekton pipeline to automate continuous build, test and deployment of applications, you can write a simple shell script to be executed from inside a container. The following script serves as an example. The Molecule driver is set to Vagrant and the provider is VirtualBox.

The parameters are read from environment variables and from the pipeline properties mounted under /config; secure values are stored as secured pipeline properties:

#!/usr/bin/env bash

apt-get update
apt-get install -y sshpass

# Set the following environment properties in BOTH ci-pr-pipeline and ci-pipeline:
#   login_username
#   testing_server_ip
#   login_user_pwd (secure value)

login_username=$(cat /config/login_username)
testing_server_ip=$(cat /config/testing_server_ip)

repo_org=$APP_REPO_ORG
repo_name=$APP_REPO_NAME
branch=$BRANCH
pipeline_id=$PIPELINE_RUN_ID

working_directory="$repo_org-$repo_name-$branch-$pipeline_id"

sshpass -p "$(cat /config/login_user_pwd)" ssh -o StrictHostKeyChecking=no \
    "$login_username@$testing_server_ip" \
    "rm -rf ~/$working_directory; \
    mkdir ~/$working_directory; \
    cd ~/$working_directory; \
    git clone $repo_org/$repo_name.git; \
    cd ~/$working_directory/$repo_name; \
    git checkout $branch; \
    bash -s < scripts/"

exit_status=$?
echo $exit_status

sshpass -p "$(cat /config/login_user_pwd)" ssh -o StrictHostKeyChecking=no \
    "$login_username@$testing_server_ip" \
    "rm -rf ~/$working_directory"

exit $exit_status

Because the script is executed inside a container, everything it does is ephemeral: nothing persists in the pipeline environment after the run completes.
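For reference, the Vagrant driver and VirtualBox provider are declared in the scenario's molecule.yml. A minimal sketch, assuming a scenario named kvm; the box name and platform are illustrative, not taken from the pipeline:

```yaml
# molecule/kvm/molecule.yml -- minimal sketch, values are illustrative
driver:
  name: vagrant
  provider:
    name: virtualbox
platforms:
  - name: instance
    box: ubuntu/focal64
provisioner:
  name: ansible
verifier:
  name: ansible
```

The scenario name in the directory path is what the -s flag in the commands below refers to.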

You can automate the entire sequence of Molecule test phases as follows:

echo "Cleaning up existing VMs and cache"
sh scripts/ kvm

python3 -m venv venv
source venv/bin/activate

pip install --upgrade pip

pip install -r ./requirements.txt

molecule lint -s kvm
exit_status=$?
if [ $exit_status -ne 0 ]; then
  molecule destroy -s kvm
  echo $exit_status
  exit $exit_status
fi

molecule converge -s kvm
exit_status=$?
if [ $exit_status -ne 0 ]; then
  molecule destroy -s kvm
  echo $exit_status
  exit $exit_status
fi

molecule verify -s kvm
exit_status=$?
if [ $exit_status -ne 0 ]; then
  molecule destroy -s kvm
  echo $exit_status
  exit $exit_status
fi

molecule idempotence -s kvm
exit_status=$?
if [ $exit_status -ne 0 ]; then
  molecule destroy -s kvm
  echo $exit_status
  exit $exit_status
fi

deactivate
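The check-destroy-exit pattern above repeats identically for every phase, so it could be factored into a helper function. A sketch; the function and variable names are my own, not from the pipeline:

```shell
#!/usr/bin/env bash
# run_phase: run one Molecule phase against a scenario; on failure,
# tear the scenario down and exit with the phase's status code.
run_phase() {
  local phase="$1" scenario="$2"
  molecule "$phase" -s "$scenario"
  local status=$?
  if [ "$status" -ne 0 ]; then
    molecule destroy -s "$scenario"
    echo "$status"
    exit "$status"
  fi
}

# The four phases then collapse to:
# for phase in lint converge verify idempotence; do run_phase "$phase" kvm; done
```

This keeps the cleanup behavior in one place, so adding a new phase is a one-line change.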

You can also write an optional script to clean up the Vagrant machines:

#!/bin/bash

echo $1
vagrant global-status --prune | grep $1 | awk '{print $1}' | while read -r line; do vagrant destroy -f $line; done

echo "Cleaning cache"
rm -rf ~/.cache/molecule/$1/

echo "About to stop the VMs on the bare metal server"
for vm in $(VBoxManage list vms | grep $1 | awk '{print substr($2, 2, length($2) - 2)}')
do
    echo "Powering off VM ${vm}"
    VBoxManage controlvm ${vm} poweroff
    VBoxManage unregistervm ${vm}
    echo "VM ${vm} powered off"
done

echo "Cleaning VMs"
ls -lrt ~/VirtualBox\ VMs/ | grep $1 | awk '{print $9}' | while read -r line; do echo $line; rm -rf ~/VirtualBox\ VMs/$line; done


In this blog post, we went over setting up a repeatable process for running Molecule tests that validate the Ansible playbooks for a given role. The pipelines run whenever a pull request is opened against a specific branch and whenever code is merged into the base branch.

Learn more about IBM Cloud DevSecOps.

