Containers as a service (CaaS) is a cloud service model that allows users to upload, organize, start, stop, scale and otherwise manage containers, applications and clusters. It enables these processes through container-based virtualization, an application programming interface (API) or a web portal interface. CaaS helps users build security-rich, scalable containerized applications, whether in on-premises data centers or in the cloud.
Why is CaaS important?
A model with broad application, CaaS helps developers streamline the process of building fully scaled container and application deployments. The model is a boon for IT departments, providing a container deployment service with governance control in a security-rich environment. The CaaS model helps enterprises simplify container management within their software-defined infrastructure.
Similar to other cloud computing services, users can choose and only pay for the CaaS resources they want. Some CaaS resource examples are compute instances, scheduling capabilities and load balancing.
In the spectrum of cloud computing services, CaaS is considered a subset of infrastructure as a service (IaaS) and sits between IaaS and platform as a service (PaaS). The basic resource of CaaS is the container, in contrast to the virtual machines (VMs) and bare-metal host systems commonly used in IaaS environments.
An essential quality of CaaS technology is orchestration, which automates key IT functions. Kubernetes and Docker Swarm are two examples of container orchestration platforms. IBM, Amazon Web Services (AWS) and Google are a few examples of public cloud CaaS providers.
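As a minimal sketch of what orchestration automates, the following Kubernetes Deployment manifest asks the orchestrator to keep three replicas of a containerized web server running; the application name and image here are illustrative placeholders, not taken from any specific provider:

```yaml
# Minimal Kubernetes Deployment: the orchestrator keeps 3 replicas
# of the container running, restarting or rescheduling them as needed.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app            # hypothetical application name
spec:
  replicas: 3              # desired state; Kubernetes reconciles toward it
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      containers:
      - name: web
        image: nginx:1.25  # any container image would do
        ports:
        - containerPort: 80
```

If a node fails or a container crashes, the orchestrator automatically restores the declared replica count, which is the kind of key IT function the text describes.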
Why are containers important?
Enterprise clients from all industries are seeing the benefits of CaaS and container technology. Using containers provides increased efficiency and gives these clients the ability to quickly deploy innovative solutions for application modernization and cloud native development with microservices. Containerization helps these clients release software faster, promotes portability between hybrid and multicloud environments, and reduces infrastructure, software licensing and operating costs.
Here are several of the client benefits of using containers:
- Portability: When an application is created in a container, the completed app has everything it needs to run, including dependencies and configuration files. This portability allows end users to reliably launch applications in different environments and in public or private clouds. It also gives enterprises considerable flexibility, accelerating the development process and making it easier to switch to a different provider or cloud environment.
- Highly efficient and cost cutting: Because containers don’t need a separate operating system, they require fewer resources than a VM. A container often requires only a few dozen megabytes to run, so a single server that would otherwise host one VM can run several containers. This efficiency helps you cut data center costs. Containers can also cut bare-metal costs, since they drive higher utilization of the underlying hardware and therefore require less of it.
Containers don’t interact and are largely isolated from other containers on the same server, although they share the same underlying resources. If the application in one container crashes, the other containers continue to run without experiencing any technical issues.
- Security: The isolation that containers have from one another doubles as a risk-minimizing security feature. If one application is compromised, then its negative effects won’t spread to the other containers.
Also, because containers run application processes in isolation from the operating system and don’t need specific software to run applications, it’s simpler to manage your host system. This benefit allows you to speedily launch updates and security patches.
- Speed: It takes only seconds to start, create, replicate or destroy a container, because containers don’t need an operating system boot. This advantage enables a quick development process, expedites time to market and operational speed, and makes releasing new versions simpler and quicker than before. This speed also improves the customer experience, enabling enterprises and developers to respond rapidly to bugs and incorporate new features as soon as customers request them.
- Scaling: Containers feature the capability for horizontal scaling, allowing end users to incorporate multiple identical containers within the same cluster to scale out. By using smart scaling and running only the containers that you need when you need them, you can dramatically cut costs and boost your return on investment.
- Streamlined development: An effective and efficient development pipeline is an advantage of container-based infrastructure. Because containers allow applications to work and run as if built locally, environmental inconsistencies are eliminated, making testing and debugging less complicated and time-consuming. This streamlining also extends to updating applications: the developer modifies the configuration file, generates new containers and deletes the previous ones, a process that takes only moments.
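The horizontal scaling described above can be sketched as a Kubernetes HorizontalPodAutoscaler, assuming a cluster with the metrics server installed; the `web-app` Deployment it targets is a hypothetical name, not from the original article:

```yaml
# Hypothetical autoscaling policy: run between 2 and 10 identical
# replicas, scaling out when average CPU utilization exceeds 70%.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: web-app-hpa
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: web-app          # placeholder Deployment name
  minReplicas: 2
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70
```

Running only the replicas the current load requires is what lets smart scaling cut costs and boost return on investment, as the scaling bullet notes.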
Managed container services and cloud container stacks
Enterprise clients looking to bolster their business by using containers must choose between two options:
- A CaaS platform and deploying in either a public cloud or an onsite infrastructure platform
- A managed container service provided by Google, Amazon or Microsoft Azure, the three most widely adopted public cloud providers
Neither of these options necessarily locks in an enterprise. Since the Cloud Native Computing Foundation (CNCF), formed from a partnership between Google and the Linux Foundation, launched its Kubernetes certification program, CNCF has ensured that vendors maintain a standard for container portability and conformance across platforms.
Before an enterprise client chooses between a managed container platform or deploying onsite, they should answer the following questions:
- Does your container require an onsite deployment, or can it be deployed in the public cloud?
- Does your IT department have the necessary skills to design, deploy and administer a Kubernetes environment? What’s required to train or retain them?
- Which public cloud platform (for example, Google, AWS or Azure) do you need to deploy containers on?
- Does the use of a multitenanted and shared Kubernetes control plane have any implications?
If your enterprise is still experimenting with containers, then managed container services may be your best choice. They’re a good starting point that doesn’t require a cluster manager, resource provisioning or a minimum platform deployment. A big benefit of managed container services is that they’re great for initial container deployment testing, and for then tailoring development and operational processes.
If your enterprise is already further along in its container deployment on Kubernetes, AWS or an onsite platform, then you may opt for your own CaaS solution. Bringing your own CaaS could give your enterprise a more feature-rich platform, one with the frameworks and services needed for a production-grade system.
The rise of Kubernetes
Not unlike the Betamax-versus-VHS war, the war for container orchestration dominance was forecast in Q4 2017 and effectively concluded by Q2 2018. Kubernetes, originally developed at Google, emerged the victor. With a clear winner, providers and adopters redoubled their efforts and focused on producing and maturing their Kubernetes deployments.
The release of managed Kubernetes services and hybrid container stacks helped ease the adoption of Kubernetes. Because Google Kubernetes Engine (GKE) helped pioneer the concept of managed container services, Kubernetes maintained greater demand than competing container services from AWS and Azure. The mid-2018 releases of Amazon Elastic Container Service for Kubernetes (Amazon EKS) and Azure Kubernetes Service (AKS) only cemented the dominance of Kubernetes.
Kubernetes and container orchestration
Kubernetes (K8s) is a container orchestration system for automating application deployment, management and scaling. Originally designed by Google and open sourced in 2014, Kubernetes is maintained by the CNCF. The Kubernetes website describes Kubernetes as a “portable, extensible open-source platform for managing containerized workloads and services that facilitates both declarative configuration and automation”.
Kubernetes functions as three primary platforms:
- Container platform
- Microservices platform
- Portable cloud platform
A container-centric management environment, Kubernetes coordinates computing, networking and storage infrastructure for user workloads. Kubernetes offers the ease of use of a PaaS along with the flexibility of IaaS and portability across infrastructure providers.
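As a small sketch of that networking coordination, the declarative Service manifest below (all names are illustrative) tells Kubernetes to give a set of pods a stable address and load-balance traffic to whichever replicas are currently running:

```yaml
# Illustrative Service: Kubernetes assigns a stable virtual IP and
# load-balances traffic across all pods labeled app=web-app.
apiVersion: v1
kind: Service
metadata:
  name: web-app-svc
spec:
  selector:
    app: web-app       # placeholder label from a hypothetical Deployment
  ports:
  - port: 80           # port the Service exposes
    targetPort: 80     # container port the traffic is forwarded to
  type: ClusterIP      # internal load balancing within the cluster
```

Applying a file like this with `kubectl apply` is an instance of the declarative configuration the Kubernetes documentation quote describes: you state the desired end state, and the platform reconciles toward it.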
Kubernetes, IBM and Red Hat open journey
With a collaboration that has extended for two decades, IBM and Red Hat have been on a journey of exploration together. An early proponent of Linux, IBM worked with Red Hat to help develop and support enterprise-grade Linux. More recently, this collaboration helped bring Kubernetes and hybrid cloud solutions to a wide array of customers. Kubernetes is also one of the foundations of the combined IBM and Red Hat hybrid cloud strategy.
These innovations became the core technologies of IBM’s USD 19 billion hybrid cloud business. After the acquisition concludes, expected at the end of 2019, Red Hat will become the newest member of the IBM hybrid cloud team, operating as a distinct unit.
Kubernetes, IBM and Hertz
The Hertz Corporation, known simply as Hertz, celebrated its centennial birthday in 2018. Grappling with legacy technology issues, the company needed help to streamline its business architecture and technology. Hertz teamed up with IBM Cloud Garage™ and developed a Kubernetes architecture that helped the enterprise build and deploy microservices-based applications to IBM Cloud™ Private and the IBM Cloud Container Service.
By collaborating with IBM, Hertz modernized its core systems, including digital channels, reservations and rates, using flexible container and microservices architectures. With its expansive global reach, Hertz ensures that its applications are highly available and enterprise-scale; the applications are expected to receive 1.5 billion hits and 30 million updates daily once in production.
Kubernetes and open source software
Open source software continues to expand its influence, further establishing its importance in the world of information technology. The IBM Services™ white paper Support solutions for your open source software environment notes that “96 percent of commercial applications have some kind of open source component” (PDF, 3.9 MB).1
In the world of open source software, Arturo Suarez has plenty of clout. Suarez created the first commercial distribution of OpenStack, the free and open source software platform for cloud computing. In a 2019 IT Biz Advisor interview, Suarez described his experience with Kubernetes, saying “Kubernetes [is] winning the container orchestration race” and that “Kubernetes evolves even faster than OpenStack, with releases every three months, and has a better governance model and adoption curve.”2
IBM Cloud Kubernetes Service
A managed container service for quick application delivery, IBM Cloud Kubernetes Service can integrate with IBM Watson®, blockchain and other advanced services. Features of IBM Cloud Kubernetes Service include:
- Intelligent scheduling
- Horizontal scaling
- Service discovery
- Load balancing
- Automated rollouts and rollbacks
- Secret and configuration management
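Two of these features, automated rollouts and rollbacks and secret management, come from upstream Kubernetes and can be sketched as follows; all names and values below are hypothetical, not drawn from IBM documentation:

```yaml
# Rolling-update strategy: pods are replaced gradually so the
# application stays available throughout an update.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-app
spec:
  replicas: 3
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1    # at most one pod down during an update
      maxSurge: 1          # at most one extra pod during an update
  selector:
    matchLabels:
      app: web-app
  template:
    metadata:
      labels:
        app: web-app
    spec:
      containers:
      - name: web
        image: nginx:1.25
---
# Secret management: credentials live outside the container image
# and are injected into pods at runtime.
apiVersion: v1
kind: Secret
metadata:
  name: db-credentials     # placeholder secret name
type: Opaque
stringData:
  username: app-user       # example value only
```

A failed rollout can be reversed with `kubectl rollout undo deployment/web-app`, which is what “automated rollouts and rollbacks” refers to in practice.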
With advanced capabilities around user-friendly cluster management and the ability to design your own cluster, IBM Cloud Kubernetes Service also offers container security and isolation policies, and integrated operational tools for deployment consistency.
IBM Cloud Kubernetes Service currently manages more than 10,000 paid production clusters and is used by clients like Think Research, Eurobits Technologies and The Weather Company, an IBM Company. Enterprises use IBM Cloud Kubernetes Service to perform the following tasks:
- Create clusters.
- Deploy a scalable web application on Kubernetes.
- Analyze logs and monitor the health of Kubernetes apps.
- Provide continuous deployment to Kubernetes.
For more information about how container services, cloud services and Kubernetes can help your business, speak to an IBM sales representative. You can also sign up for IBM Cloud Kubernetes Service to build and develop at no cost in the IBM Cloud environment. IBM is here to help you move your business forward with confidence.
Deeper dive into containers and Kubernetes
- Orchestration and mobility in a containerized world
- IBM Cloud Kubernetes Service – FAQ
- Using App ID to secure Docker and Kubernetes applications
- Availability of IBM Services for Container Platforms
- IBM Services. “Support solutions for your open source software environment,” IBM, 2019. https://www.ibm.com/downloads/cas/BWJWOJRD (PDF, 3.9 MB)
- Camilla Sharpe. “Q&A: How open source technology is shaking up the IT landscape,” IBM, May 16, 2019. https://itbizadvisor.com/2019/05/qa-how-open-source-technology-is-shaking-up-the-it-landscape