The holiday season accentuates the need for IT agility and flexibility, especially within the retail, hospitality and travel industries. 

However, this need is equally important for any business that experiences variable demand or events that require a rapid rollout of new services. 

With some basic planning, you can automate the process of secure app and infrastructure provisioning and delivery. This blog post is a quick guide to building a continuous deployment (CD) pipeline with IBM Cloud Pak for Watson AIOps and implementing the IT processes for secure delivery in just a few clicks, without manual IT involvement.

Traditionally, developers would be required to use service portals like ServiceNow to request virtual machines (VMs) and storage to deploy applications. Today, IT teams are automating this process with Infrastructure as Code and auto-scaling automation in response to these pressures to help accelerate continuous deployment of critical apps and services. This also ensures reliability in the whole stack — from applications to infrastructure. In doing so, businesses can leverage the same workflows used to ensure security and audit requirements, while accelerating the deployment of code and removing risk.

By combining Infrastructure Automation (IA) with the OpenShift GitOps tool based on Argo continuous deployment (CD), users gain complete control of both the application and infrastructure stacks, enabling a rapid response to changes during the holiday season.

How this works in action

Assume an OpenShift cluster where both OpenShift GitOps and IBM Cloud Pak® for Watson AIOps are deployed, the Infrastructure Automation Operator is activated and ArgoCD is set up.

Infrastructure Automation exposes REST APIs, which can be called from any external tool to invoke Terraform- and Ansible-based services, providing a seamless connection from an ArgoCD application. 
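As a sketch of what such a call might look like: Infrastructure Automation's REST layer is derived from ManageIQ, but the host, service template ID, token and payload fields below are placeholders, so consult your deployment's API documentation for the exact paths and parameters.

```python
import json
import urllib.request

# Placeholder host -- replace with your Infrastructure Automation route.
IA_HOST = "https://infra-automation.example.com"

def build_order_request(service_id: str, params: dict, token: str) -> urllib.request.Request:
    """Assemble a REST request that orders a Terraform-based IA service.

    The "order" action on a service template follows the ManageIQ-style API
    that Infrastructure Automation is based on; field names are assumptions.
    """
    body = json.dumps({"action": "order", "resource": {"params": params}}).encode()
    return urllib.request.Request(
        url=f"{IA_HOST}/api/service_templates/{service_id}",
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # token issued by your IA instance
        },
    )

req = build_order_request("42", {"vm_count": 3}, "example-token")
# urllib.request.urlopen(req)  # uncomment to send against a live instance
```

Because the request is plain HTTP, the same call can be issued from an ArgoCD sync hook, a pipeline job or any other external tool.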

Bring ArgoCD and Infrastructure Automation together to rapidly provision applications and infrastructure, achieving the speed and agility needed for a quality service delivery:

The Infrastructure Automation service is treated entirely as code that can be checked into Git, and it is part of the Git repo used by ArgoCD for deployment. The service includes securing the required ServiceNow approvals, provisioning a new Kubernetes cluster and VMs, configuring VM databases via Ansible and updating the CMDB in ServiceNow.
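To make the "service as code" idea concrete, here is a minimal sketch of what a Git-tracked service definition could contain. The API group, kind and field names below are assumptions for illustration; the actual custom resource schema depends on the Infrastructure Automation Operator version installed on your cluster.

```python
import json

# Hypothetical manifest for an Infrastructure Automation service tracked in
# the Git repo that ArgoCD syncs. Group/kind/fields are illustrative only.
service_manifest = {
    "apiVersion": "infra.management.ibm.com/v1alpha1",  # assumed group/version
    "kind": "ServiceRequest",                           # assumed kind
    "metadata": {"name": "holiday-retail-stack"},
    "spec": {
        "steps": [
            {"name": "servicenow-approval"},   # secure the required change approval
            {"name": "provision-cluster"},     # Terraform: new Kubernetes cluster
            {"name": "provision-vms"},         # Terraform: database VMs
            {"name": "configure-databases"},   # Ansible: configure VM databases
            {"name": "update-cmdb"},           # record the new assets in ServiceNow
        ],
    },
}

print(json.dumps(service_manifest, indent=2))  # the rendered file is checked into Git
```

Each step mirrors one stage of the workflow described above, so an approval, the provisioning runs and the CMDB update all version together in one reviewable file.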

When the ArgoCD application syncs the desired state with the Git repo, the system deploys all the components, which causes the underlying resources to be created. This is the Infrastructure Automation service represented as a Kubernetes resource.
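The ArgoCD side of this wiring is a standard Argo CD `Application` resource pointing at the repo that holds the service definitions. The `Application` kind and its `source`/`destination`/`syncPolicy` fields are part of the Argo CD API; the repo URL, path and namespaces below are placeholders.

```python
import json

# Argo CD Application that keeps the Infrastructure Automation service
# definitions in sync with Git. Repo URL, path and namespaces are placeholders.
argo_app = {
    "apiVersion": "argoproj.io/v1alpha1",
    "kind": "Application",
    "metadata": {"name": "infra-automation-services", "namespace": "openshift-gitops"},
    "spec": {
        "project": "default",
        "source": {
            "repoURL": "https://github.com/example/infra-services.git",  # placeholder
            "targetRevision": "main",
            "path": "services",  # directory holding the service manifests
        },
        "destination": {
            "server": "https://kubernetes.default.svc",
            "namespace": "infra-automation",
        },
        # Automated sync applies Git changes without manual intervention;
        # prune and selfHeal keep the cluster converged on the repo state.
        "syncPolicy": {"automated": {"prune": True, "selfHeal": True}},
    },
}

print(json.dumps(argo_app, indent=2))
```

With automated sync enabled, merging a change to the service definitions in Git is all it takes to trigger the provisioning flow.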

This is the view of those same services in the Infrastructure Automation UI.

Here is the completed deployment of all the required components in the Infrastructure Automation service.

In summary

Infrastructure Automation provides an enterprise Terraform capability that integrates with Red Hat Ansible Tower for configuration management. It also includes a comprehensive day 2 management and operations toolset based on the open-source ManageIQ. 

Infrastructure Automation drives complex service deployments from GitOps using APIs from Argo CD applications, while preserving the ability to do self-service deployments or integrating with other portals. 

It also allows seamless integration with existing IT processes and tools like ServiceNow or CMDBs that aren’t directly integrated with the Kubernetes control plane.

Learn more

IBM Cloud Pak for Watson AIOps capabilities are designed to support and enhance a broad range of IT practices, including DevOps, SRE and service management. Its outcome focus includes anomaly detection, event correlation and root cause analysis to improve monitoring, service management, cloud charge-backs and automation tasks.
