The effects of cloud-native architecture on continuous delivery

Modern, “cloud-native” architectures are making a big impact on continuous delivery.

Continuous delivery is a set of practices and tools that enables rapid testing and release of changes. Cloud-native architectures typically have many moving pieces, each of which is intended to stand on its own. The technologies of choice for these applications are generally container platforms such as Kubernetes or Cloud Foundry, or functions-as-a-service.

As these architectures take hold, two concerns keep cropping up for our customers and prospects. The first is the belief that the continuous delivery tools currently used in their organizations are too burdensome for cloud-native application teams. The second is a burgeoning need for help with release management.

These concerns seem to be in conflict, but have roots in the same underlying dynamics.

Continuous delivery for existing apps

Most applications in the enterprise are three-tier web applications written predominantly in Java or .NET. Sprinkled in are message queues, service buses and other assorted middleware. Most of this runs in virtual machines. Each element of an application can be built independently, and the practice of continuous integration is mainstream enough that generally every code commit produces a new build.

Unfortunately, it is also pretty common for a change in one component to break something else in the runtime test environment, which is why application release automation tools, such as IBM UrbanCode Deploy, have been thriving for the past 10 years or so.

UrbanCode Deploy picks up builds from continuous integration tools and deploys the larger application to test and production environments. It also tracks the collection of web services, front ends, message queue settings and database schema updates to ensure they fit together properly, and even ensures they are deployed in the right order.

Put simply, a continuous integration (CI) tool builds your stuff and an application release automation tool orchestrates the delivery of lots of related stuff.
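
To make the distinction concrete, here is a minimal sketch of the kind of sequencing a release automation tool handles. It is plain Python, the component names are hypothetical, and deploy_component is a placeholder rather than anything from UrbanCode Deploy: components declare their dependencies, and deployments run in an order that respects them.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical application snapshot: each component and the components it depends on.
components = {
    "db-schema":     [],               # schema updates go out first
    "mq-config":     [],
    "order-service": ["db-schema", "mq-config"],
    "web-frontend":  ["order-service"],
}

def deploy_component(name: str) -> None:
    """Placeholder for the real deployment step (scripts, plugins, platform API calls)."""
    print(f"deploying {name}")

# Deploy in dependency order, the way a release automation tool would sequence the pieces.
for component in TopologicalSorter(components).static_order():
    deploy_component(component)
```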

How continuous delivery is different for cloud-native apps

Cloud-native applications are presumed to be loosely coupled. Recognizing that orchestrating changes and testing large, integrated systems is expensive and slow, architects are pushing for each service to have a well-defined API and responsibilities. The ideal result is that a change to a service can be quickly tested with minimal expected impact on other services. In theory, a typical production deployment should impact only a single service, and the deployment itself should be quite easy, often just one or two command-line calls.
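
As a rough illustration of how small that deployment can be, the sketch below uses the Kubernetes Python client to roll a new image out to a single service; the service name, namespace and image are hypothetical, and the same effect could come from a one-line kubectl call in a pipeline.

```python
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() when running inside the cluster
apps = client.AppsV1Api()

# Patch only the catalog-service deployment; no other service is touched,
# and Kubernetes handles the rolling update from here.
apps.patch_namespaced_deployment(
    name="catalog-service",
    namespace="production",
    body={"spec": {"template": {"spec": {"containers": [
        {"name": "catalog", "image": "registry.example.com/catalog:1.4.2"}
    ]}}}},
)
```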

These trends appear to cut at some of the core value of application release automation: Keeping track of many interrelated services and orchestrating complex deployments. It’s easy to see why we get questions about release automation tools being overkill for some cloud-native applications. It appears that simpler build pipeline tools such as Jenkins might be a better fit.

The rise in concern about release management is more curious. If services are flying to production independently, shouldn’t release management be getting easier? When we dig into this, we find two practical matters are in the way.

The first is that cloud-native microservices are not as decoupled as when drawn up on the whiteboard. While deployments are still pretty easy for many applications, keeping track of what is in a test lab, and understanding how that lab is different from production, gets harder as the number of services grows.
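
A small sketch shows why that tracking gets harder. The inventories below are hypothetical, hard-coded maps of service to deployed version (in practice they would come from querying the cluster or the deployment tool); diffing two of them is trivial with four services and a real tooling problem with forty.

```python
# Hypothetical inventories: service name -> deployed version.
test = {"catalog": "1.4.2", "orders": "2.1.0", "payments": "3.0.1", "search": "0.9.7"}
prod = {"catalog": "1.4.1", "orders": "2.1.0", "payments": "3.0.0"}

def environment_drift(a: dict, b: dict) -> dict:
    """Return services whose versions differ, or that exist in only one environment."""
    drift = {}
    for service in sorted(set(a) | set(b)):
        va, vb = a.get(service, "absent"), b.get(service, "absent")
        if va != vb:
            drift[service] = (va, vb)
    return drift

for service, (in_test, in_prod) in environment_drift(test, prod).items():
    print(f"{service}: test={in_test} prod={in_prod}")
```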

The second matter is the temptation to use simpler tools, better suited to cloud-native development, for more traditionally architected applications that need deeper coordination and orchestration. Perhaps a cloud-native team was the DevOps pioneer and their toolset was then brought to the wider organization.

In either case, the need for more coordination than is present in build pipeline tools tends to surface as a demand for “release management” help.

The path forward

Ideally, organizations should examine their applications to determine how loosely coupled their architectures are and then choose pipeline solutions to match the coupling and deployment difficulty. Note that, while an application using newer platforms such as Kubernetes is more likely to be loosely coupled, there are no guarantees.

As observed in Accelerate: The Science of Lean Software and DevOps, “It’s possible to achieve these [architectural] characteristics even with packaged software and ‘legacy’ mainframe systems — and conversely, employing the latest whizzy microservices architecture deployed on containers is no guarantee of higher performance if you ignore these characteristics.”

For significant coupling and complex deployments, use a release automation tool. For users who have tried to decouple, implemented a series of Jenkins jobs, and now need to bring a bit of order and coordination, put lightweight release orchestration such as IBM UrbanCode Velocity over the top. Those who have fully decoupled can use the simplest pipelines. You’ve earned it.

For a deeper look at applying each of these patterns, please check out the webinar “The Future of Continuous Delivery.”
