Most of today’s advanced enterprises have already migrated a significant portion of their IT infrastructure to the cloud. Using virtualization technologies from companies that have become household names in the space—such as VMware, Citrix and Nutanix—organizations can successfully virtualize servers, desktops, applications and more, reducing costs and dramatically shortening DevOps lifecycles.
But virtualization in its simplest form, using virtual machines (VMs) to run software hosted elsewhere, is over two decades old. Today, newer, more efficient approaches are disrupting the market, enabling new capabilities and efficiencies that enterprises need to thrive. In particular, containers—application code that can run in any computing environment—are changing the way organizations build, deploy, maintain and modernize applications.
Still, taking a traditional approach to virtualization makes sense for some enterprises. After all, many companies have relied heavily on VMs in the past, and it’s unrealistic to think they’ll be able to change overnight—or that they’d want to. But when it comes to building and deploying modern applications that can tap into the power of transformative technologies like generative AI, a cloud-native approach that utilizes a hybrid cloud application platform has many undeniable advantages.
“Many companies are looking at their existing VM infrastructure and trying to find new efficiencies,” says Kyle Brown, IBM Fellow and CTO of Cloud Architecture at IBM Cloud Labs. “And often what that means is this journey away from VMware towards a more modern, hybrid cloud infrastructure that utilizes containers.”
Modern cloud infrastructure, where organizations and individuals leverage virtual compute resources to run software and applications like word processors, email, texting and social networking, wouldn’t be possible without virtualization and virtual machines (VMs). At its most fundamental level, virtualization creates an abstraction layer over a device’s hardware, allowing its components—including processors, memory, networks and storage—to be divided into VMs. Each VM can run its own operating system (OS) like a separate, physical computer, even though it shares the same hardware.
The use of shared resources via VMs and virtualization transformed IT infrastructure in the early 2000s. This new compute environment, known as “the cloud,” led many organizations to overhaul the way they managed compute resources, demanding greater agility, flexibility and cost savings from their IT environments. In the cloud, rather than purchasing critical resources like CPUs and application servers outright, businesses started to buy what they needed on a pay-as-you-go model. As workloads changed, they could simply scale their VMs up or down as needed.
While new approaches, better tuned to the demands of emerging technologies like artificial intelligence (AI) and the Internet of Things (IoT), have caused some reshuffling, the global market for VMs has remained strong. Last year, Precedence Research valued the global VM market at USD 11.1 billion and projected it to grow at a compound annual growth rate (CAGR) of almost 15% over the next 10 years.1
While VMs were pivotal in moving IT infrastructure from on-premises architectures into the cloud, they have been unable to keep up with the demands of modern app development. From customer relationship management (CRM) tools like Salesforce and HubSpot, to collaboration solutions like Zoom and Slack, to gen AI platforms like ChatGPT, applications have come to define modern enterprises.
To sustain the level of innovation needed to run these applications, companies must manage an evolving DevOps environment with the latest tools and technologies. Unfortunately, while VMs were considered cutting-edge in the early years of cloud computing, their limitations have become increasingly apparent as app dev practices have evolved.
Here are a few of the challenges VMs face in modern app dev environments.
Many of the cutting-edge tools developers need to build, deploy and support modern applications are a poor fit for VMs. Microservices, for example (an approach that builds apps out of loosely coupled, independent services and components), work much better in containers.
VMs run on a hypervisor, a software layer that enables multiple operating systems to run on the same physical server. However, each VM the hypervisor provisions requires its own allocation of CPU, memory and storage, which can slow processing times on the host machine. Newer technologies based on containers don’t require an OS to be booted for each workload instance, which can lead to much faster startup times.
VMs run a full copy of an operating system along with virtual copies of the hardware the OS needs, such as servers, networking components or desktops. This places significant demands on system memory and CPU cycles that newer technologies, like containers, avoid.
With all the constraints VMs face in modern app ecosystems, it was inevitable that a better-suited technology would come along. Containers are executable units of software that package application code with its libraries and all its dependencies, which means the code can run in any compute environment. Containers are a form of operating-system-level virtualization: rather than emulating hardware, they share the host’s kernel (the core program in an operating system that manages hardware and allocates compute resources) and use it to isolate each workload.
Due to their portability and efficiency, containers quickly became the preferred compute units for running modern applications in the cloud. Additionally, with the rise of hybrid cloud environments, where on-premises, private and public cloud resources are combined to form a cohesive IT infrastructure, containers have become critical to the core operations of many large businesses. According to GlobeNewswire, the global application container market was nearly USD 4 billion last year and is expected to reach nearly USD 30 billion in the next 6 years, growing at a CAGR of 25.87%.2
While many modern enterprises still use VMs to run legacy applications, containers have become the preferred virtualization technology for modern app dev teams. When developers set out to build, test and deploy new applications in the cloud, they rely on containers so the apps they build will run in any environment, including private, public and hybrid multicloud environments. Additionally, containers are optimized for modern DevOps practices such as AI-assisted code generation and continuous integration and continuous deployment (CI/CD).
While the core idea behind container technology (isolating application code so it can be tested and run without disrupting other services) has been around since the 1970s, it wasn’t until the release of Docker in 2013 that containers became a truly transformative technology. Docker was the first widely adopted open-source platform for creating, deploying and managing containers.
Docker transformed containerization, streamlining and simplifying the way developers build, deploy and manage the code that underpins some of the most popular distributed applications in the world, including ecommerce sites, social networks, banking platforms and more.
Docker made app-dev easier in large part because it provided a consistent development environment across any kind of machine, speeding up DevOps lifecycles and simplifying workflows. Additionally, Docker is open source, meaning it’s constantly being iterated on and improved by developers who can freely modify its code. This has led to even wider adoption, making Docker nearly ubiquitous in today’s advanced DevOps environments.
While the release of Docker in 2013 transformed the way DevOps teams built and deployed application code using containerization technology, it wasn’t until Kubernetes, the open-source container orchestration platform, reached its 1.0 release in 2015 that containerization became scalable at an enterprise level. Kubernetes was fundamental in automating the deployment of containerized applications, reducing the cost, time and complexity of DevOps lifecycles.
With Docker and Kubernetes, the foundation was laid for what would become the modern app-dev ecosystem, underpinned by the fundamentals of containers and cloud computing but ready to meet the challenges of emerging technologies like gen AI.
Perhaps no single term better defines the modern app-dev environment, or better captures how far it has evolved from the early days of cloud computing, than cloud native. Cloud native describes an approach to building applications out of discrete, reusable components (known as microservices) that are compute-environment agnostic, meaning they can run virtually anywhere.
Unlike traditional applications, which can be adapted to run in the cloud, cloud-native apps rely on cloud technologies like Docker and Kubernetes and are designed to take advantage of key features of cloud environments, such as auto-scaling, load balancing and managed services.
Because microservices can be deployed independently, without disrupting the end users of an application, they are well suited to modern DevOps environments that rely heavily on automating critical tasks like code integration, testing and deployment.
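To make that automation concrete, here is a minimal sketch of what a CI/CD pipeline for one microservice might look like, written as a GitHub Actions workflow. The service name, registry address and REGISTRY_TOKEN secret are hypothetical, and the steps assume the repository already provides a test target and a Dockerfile.

```yaml
# Hypothetical CI/CD workflow for a single microservice ("orders-service").
# Each push to main runs the tests, builds a container image and publishes it.
name: orders-service-ci
on:
  push:
    branches: [main]
jobs:
  build-and-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run unit tests
        run: make test   # assumes the repo defines a "test" target
      - name: Build container image
        run: docker build -t registry.example.com/orders-service:${{ github.sha }} .
      - name: Push image to registry
        run: |
          echo "${{ secrets.REGISTRY_TOKEN }}" | docker login registry.example.com -u ci --password-stdin
          docker push registry.example.com/orders-service:${{ github.sha }}
```

Because each microservice has its own pipeline like this one, a team can ship a fix to one service without rebuilding or redeploying the rest of the application.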
Constantly on the lookout for greater scalability, resilience and cost benefits, modern organizations are increasingly taking a cloud-native approach when it comes to building, deploying and maintaining their applications.
“IT managers usually feel the pressure from their developers initially,” says Ryan Dejana, a senior member of the IBM CIO Hybrid Cloud Integrated Platform Team. “They want to start moving in a cloud-native direction, embracing Kubernetes, Docker and other containerization technologies, because most of the software they work with is already there.”
Here’s a closer look at some of the enterprise benefits a cloud-native approach to app dev can yield.
While many enterprises would probably love it if a switch existed that could simply migrate their existing IT infrastructure from the VM era to a more modern, cloud-native approach, it isn’t that simple. When it comes to impactful digital transformation, rather than shutting down VMs and building new, cloud-native applications from scratch, organizations need to take a more nuanced approach.
“The beauty of some of these new platforms,” says Dejana, “is that you don’t have to get rid of your legacy infrastructure overnight to leverage the benefits of a cloud-native approach.”
To modernize a legacy app built on a monolithic architecture using an outdated programming language and older technology (or, really, for any other reason), organizations must do four things: assess, containerize, orchestrate and integrate.
Before adopting any of the newer containerization technologies that modern apps are built on, enterprises must first assess which of their applications, platforms and services need to be modernized. Typically, these are their most widely used, valuable and innovative apps.
VMs, though outdated, are still fine for many older applications, such as those with well-defined purposes that aren’t likely to change or require massive scalability. Calculators, text editors and accounting systems are a few examples.
Once an organization has identified the applications it needs to modernize, it must move them from legacy infrastructure (usually VMs) into containers. This usually involves using Docker or another containerization solution to package the app’s code along with its libraries and configuration requirements, as sketched below.
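As a rough illustration, the packaging step is typically described in a Dockerfile. The sketch below assumes a simple Python web service with an app.py entry point and a requirements.txt file; the base image, file names and port are placeholders rather than a prescription.

```dockerfile
# Illustrative Dockerfile for a hypothetical Python web service.
# It packages the application code, its dependencies and its runtime
# into a single portable image.
FROM python:3.12-slim

WORKDIR /app

# Install the app's library dependencies first so this layer is cached
# between builds when only the source code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and any configuration it needs.
COPY . .

# Document the port the service listens on and define how it starts.
EXPOSE 8080
CMD ["python", "app.py"]
```

Building this file (for example, with docker build -t orders-service:1.0 .) produces an image that runs the same way on a laptop, an on-premises cluster or a public cloud.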
Once the app code has been containerized, platforms like Kubernetes can deploy, configure and manage the container, scale it up and down as needed and roll out new features and bug fixes. Kubernetes and other orchestration tools also help configure an application’s networking so it can interact with other apps and services, such as databases.
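For illustration, here is a hedged sketch of what that orchestration step might look like as a Kubernetes manifest. It runs three replicas of a hypothetical orders-service image and exposes them through a Service so other apps and databases can reach them by a stable name; the image, labels and ports are placeholders.

```yaml
# Illustrative Kubernetes manifest for the containerized service above.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-service
spec:
  replicas: 3                     # Kubernetes keeps three copies running
  selector:
    matchLabels:
      app: orders-service
  template:
    metadata:
      labels:
        app: orders-service
    spec:
      containers:
        - name: orders-service
          image: registry.example.com/orders-service:1.0
          ports:
            - containerPort: 8080
---
# A Service gives the pods a stable network name other workloads can call.
apiVersion: v1
kind: Service
metadata:
  name: orders-service
spec:
  selector:
    app: orders-service
  ports:
    - port: 80
      targetPort: 8080
```

Changing the replicas field (or attaching an autoscaler) is how the platform scales the workload up or down without touching the application code itself.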
Once an app has been containerized and is under the control of an orchestration tool, it’s ready to be integrated into an organization’s new cloud-native app ecosystem. Unlike legacy apps, newer cloud-native apps, as well as legacy apps that have been modernized, will need to be supported by cloud-native tools and processes. Examples include development automation and AI-assisted coding tools like GitHub Actions and GitHub Copilot, and application performance monitoring (APM) solutions like Datadog and Instana.
Enterprises that want to modernize their application infrastructure and support the development of cutting-edge applications that leverage technologies like gen AI need to give their developers the right tools. Applied to the app-dev environment, hybrid cloud architecture allows DevOps teams, tasked with building new applications as well as modernizing old ones, to flourish.
Taking advantage of the latest in cloud technologies, hybrid cloud platforms help teams self-manage their journey to cloud native with features like enhanced observability, automated updates and migration toolkits.
“Organizations that utilize a hybrid cloud platform can easily work with containers and VMs, using tools that allow them to talk to each other,” says Dejana. “This makes it easier for the whole enterprise to move from a pure VM-based environment towards a more cloud-native, container-centric model, which is critical for building, deploying and managing today's most advanced applications.”
Get started with a fully managed Red Hat OpenShift platform. Accelerate your development and deployment process with scalable, secure solutions tailored to your needs.
Streamline your digital transformation with IBM’s hybrid cloud solutions, built to optimize scalability, modernization, and seamless integration across your IT infrastructure.
Unlock new capabilities and drive business agility with IBM’s cloud consulting services. Discover how to co-create solutions, accelerate digital transformation, and optimize performance through hybrid cloud strategies and expert partnerships.
1. Virtual Machine Market Size, Share, and Trends 2025 to 2034, Precedence Research, February 2025
2. Application Container Market Surges to USD 29.70 Billion by 2031, GlobeNewswire, May 2024