What is container orchestration?

Container orchestration automates and simplifies the provisioning, deployment, and management of containerized applications.

Today, Kubernetes is the most popular container orchestration platform, and most leading public cloud providers, including Amazon Web Services (AWS), Google Cloud Platform, IBM Cloud® and Microsoft Azure, offer managed Kubernetes services. Other container orchestration tools include Docker Swarm and Apache Mesos.

More on containers and why they need orchestration

Containers are lightweight, executable application components that combine application source code with all the operating system (OS) libraries and dependencies that are required to run the code in any environment. 

The ability to create containers has existed for decades, but it became widely available in 2008, when container functionality was added to the Linux kernel, and widely used with the arrival of the Docker open source containerization platform in 2013. (Docker is so popular that "Docker containers" and "containers" are often used interchangeably.)

Because they are smaller, more resource-efficient and more portable than virtual machines (VMs), containers—and more specifically, containerized microservices or serverless functions—have become the de facto compute units of modern cloud-native applications.

In small numbers, containers are easy enough to deploy and manage manually. But in most organizations the number of containerized applications is growing rapidly, and managing them at scale—especially as part of a continuous integration/continuous delivery (CI/CD) or DevOps pipeline—is impossible without automation.

Enter container orchestration, which automates the operations tasks around deploying and running containerized applications and services. According to recent IBM research, 70% of developers who use containers report using a container orchestration solution, and 70% of those report using a fully managed (cloud-managed) container orchestration service at their organization.

How container orchestration works

While there are differences in methodologies and capabilities across tools, container orchestration is essentially a three-step process (or cycle, when part of an iterative agile or DevOps pipeline).

Most container orchestration tools support a declarative configuration model: A developer writes a configuration file (in YAML or JSON, depending on the tool) that defines a desired configuration state, and the orchestration tool runs the file and uses its own intelligence to achieve that state. The configuration file typically:

  • Defines which container images make up the application, and where they are located (in what registry).

  • Provisions the containers with storage and other resources.

  • Defines and secures the network connections between containers.

  • Specifies versioning (for phased or canary rollouts).
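In Kubernetes, for example, this declarative configuration is a YAML manifest. The sketch below (all names, registry addresses, and values are illustrative, not from the original article) touches each of the points above: the container image and its registry, provisioned storage, a network port, and an image tag that supports versioned rollouts.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app                  # illustrative application name
spec:
  replicas: 3                    # desired state: three identical containers
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      containers:
        - name: web
          # image, registry location, and version tag for phased rollouts
          image: registry.example.com/web-app:1.4.2
          ports:
            - containerPort: 8080          # network endpoint exposed to the cluster
          volumeMounts:
            - name: data
              mountPath: /var/data         # storage provisioned for the container
      volumes:
        - name: data
          emptyDir: {}                     # simple ephemeral volume for illustration
```

The developer never scripts *how* to reach this state; the orchestrator continuously reconciles the cluster toward it.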

The orchestration tool schedules deployment of the containers (and replicas of the containers, for resiliency) to a host. It chooses the best host based on available CPU capacity, memory, or other requirements or constraints specified in the configuration file. 
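In Kubernetes, for example, these scheduling inputs are expressed as resource requests and node constraints declared on the container (the values and the node label below are illustrative assumptions):

```yaml
spec:
  containers:
    - name: web
      image: registry.example.com/web-app:1.4.2
      resources:
        requests:
          cpu: "500m"          # schedule only onto a host with 0.5 CPU available
          memory: "256Mi"      # ...and 256 MiB of memory available
        limits:
          cpu: "1"             # upper bound the container may consume
          memory: "512Mi"
  nodeSelector:
    disktype: ssd              # constraint: only hosts labeled disktype=ssd qualify
```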

Once the containers are deployed, the orchestration tool manages the lifecycle of the containerized application based on the configuration file. This includes:

  • Managing scalability (up and down), load balancing, and resource allocation among the containers.

  • Ensuring availability and performance by relocating the containers to another host in the event of an outage or a shortage of system resources.

  • Collecting and storing log data and other telemetry used to monitor the health and performance of the application.
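Continuing the Kubernetes example, lifecycle management is also declared rather than scripted. The sketch below (names and thresholds are illustrative) pairs a container health check, which triggers automatic restarts, with a HorizontalPodAutoscaler that scales replicas with CPU load:

```yaml
# Health check: the orchestrator restarts the container if this probe fails
livenessProbe:
  httpGet:
    path: /healthz
    port: 8080
  initialDelaySeconds: 10
  periodSeconds: 15
---
# Automatic scaling between 2 and 10 replicas, targeting 70% CPU utilization
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-app
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web-app
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```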

Benefits of container orchestration

It's probably clear that the chief benefit of container orchestration is automation, and not only because it greatly reduces the effort and complexity of managing a large containerized application estate. By automating operations, orchestration supports an agile or DevOps approach that allows teams to develop and deploy in rapid, iterative cycles and release new features and capabilities faster.

In addition, an orchestration tool's intelligence can enhance or extend many of the inherent benefits of containerization. For example, automated host selection and resource allocation, based on declarative configuration, maximizes efficient use of computing resources; automated health monitoring and relocation of containers maximizes availability.


Kubernetes and container orchestration

As noted above, Kubernetes is the most popular container orchestration platform. Together with other tools in the container ecosystem, Kubernetes enables a company to deliver a highly productive platform-as-a-service (PaaS) by addressing many of the infrastructure- and operations-related tasks and issues around cloud-native application development, so that development teams can focus exclusively on coding and innovation.

Kubernetes’ advantages over other orchestration solutions are largely a result of its more comprehensive and sophisticated functionality in several areas, including:

  • Container deployment. Kubernetes deploys a specified number of containers to a specified host and keeps them running in a desired state.

  • Rollouts. A rollout is a change to a deployment. Kubernetes lets you initiate, pause, resume, or roll back rollouts.

  • Service discovery. Kubernetes can automatically expose a container to the internet or to other containers by using a DNS name or IP address.

  • Storage provisioning. Developers can set Kubernetes to mount persistent local or cloud storage for their containers as needed.

  • Load balancing and scalability. When traffic to a container spikes, Kubernetes can employ load balancing and scaling to distribute traffic across the network and maintain stability and performance. (It also saves developers the work of setting up a load balancer.)

  • Self-healing for high availability. When a container fails, Kubernetes can restart or replace it automatically. It can also take down containers that don’t meet your health-check requirements.

  • Support and portability across multiple cloud providers. As noted earlier, Kubernetes enjoys broad support across all leading cloud providers. This is especially important for organizations deploying applications to a hybrid cloud or hybrid multicloud environment.

  • Growing ecosystem of open source tools. Kubernetes also has an ever-expanding stable of usability and networking tools to enhance its capabilities via the Kubernetes API. These include Knative, which enables containers to run as serverless workloads; and Istio, an open source service mesh. 
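Several of the capabilities above, such as service discovery and load balancing, come together in a single Kubernetes Service object. The sketch below (names and ports are illustrative) gives the application a stable DNS name inside the cluster and spreads incoming traffic across its healthy containers:

```yaml
apiVersion: v1
kind: Service
metadata:
  name: web-app          # other containers reach the app at the DNS name "web-app"
spec:
  type: LoadBalancer     # also expose it externally via the provider's load balancer
  selector:
    app: web-app         # traffic is balanced across all pods carrying this label
  ports:
    - port: 80           # port clients connect to
      targetPort: 8080   # port the container listens on
```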

Learn more about Kubernetes

Related solutions
Red Hat OpenShift on IBM Cloud

Red Hat OpenShift on IBM Cloud leverages OpenShift in public and hybrid environments for velocity, market responsiveness, scalability, and reliability.

Explore Red Hat OpenShift on IBM Cloud
IBM Cloud Satellite

With IBM Cloud Satellite, you can launch consistent cloud services anywhere—on premises, at the edge and in public cloud environments.

Explore IBM Cloud Satellite
IBM Cloud Code Engine

Run container images, batch jobs or source code as serverless workloads—no sizing, deploying, networking, or scaling required. 

Explore IBM Cloud Code Engine
Optimize Kubernetes with IBM® Turbonomic®

Automatically determine the right resource allocation actions—and when to make them—to help ensure that your Kubernetes environments and mission-critical apps get exactly what they need to meet your SLOs.

Explore IBM Turbonomic
Resources
Containers in the enterprise

New IBM research documents the surging momentum of container and Kubernetes adoption.

Combine the best features of cloud and traditional IT

Container orchestration is a key component of an open hybrid cloud strategy that lets you build and manage workloads from anywhere.

What is Docker?

Docker is an open source platform for building, deploying, and managing containerized applications.

Take the next step

Red Hat OpenShift on IBM Cloud offers developers a fast and secure way to containerize and deploy enterprise workloads in Kubernetes clusters. Offload tedious and repetitive tasks involving security management, compliance management, deployment management and ongoing lifecycle management. 

Explore Red Hat OpenShift on IBM Cloud

Start for free