Deploying with containers
Deploy IBM Workload Automation quickly and easily with containers.
The following sections provide more details about deploying IBM Workload Automation with containers, based on your environment.
- Docker containers
- An easy and fast deployment method for IBM Workload Automation. Docker Compose instantly downloads the product image, creates a container, and starts up the product.
Docker is a state-of-the-art technology that creates, deploys, and runs applications by using containers. Packages contain an application together with all of the components it requires, such as libraries, specific configurations, and other dependencies, so you can deploy it in no time on any other Linux or Windows workstation, regardless of any differences in settings between the source and the target workstations.
Docker adoption ensures standardization of your workload scheduling environment and provides an easy method to replicate environments quickly in development, build, test, and production environments, speeding up the time it takes to get from build to production significantly. Install your environment using Docker to improve scalability, portability, and efficiency.
Docker containers are available for the UNIX, Windows, and Linux on Z operating systems.
For more information, see the introductory readme file for all components available at IBM Workload Automation. You can also find detailed information for each component in the related readme file. You can also use Docker containers to store all of the latest integrations available on Automation Hub. For further information, see Container plug-in.
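A Docker Compose deployment of this kind is typically driven by a small compose file. The sketch below is illustrative only: the image name, port, environment variable, and volume are placeholders, not the product's actual published values, which are documented in the component readme files.

```yaml
# Minimal sketch of a compose file for a containerized product component.
# All names below are placeholders -- check the product readme for real values.
services:
  wa-server:
    image: example/workload-automation-server:latest  # placeholder image name
    ports:
      - "31116:31116"            # placeholder host:container port mapping
    environment:
      - WA_PASSWORD=change-me    # placeholder credential variable
    volumes:
      - wa-data:/data            # persist product data across container restarts

volumes:
  wa-data:
```

With a file like this saved as `docker-compose.yml`, running `docker compose up -d` pulls the image, creates the container, and starts the product in one step.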
- Amazon Web Services (AWS) Elastic Kubernetes Service (EKS) (Amazon EKS)
- You can use Amazon EKS to run IBM® Workload Scheduler containerized product components on the Amazon Web Services secure cloud platform.
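Before deploying the containerized components, you need a running EKS cluster. A common way to create one is the `eksctl` CLI; the sketch below uses placeholder values for the cluster name, region, and node count, under the assumption that your AWS credentials are already configured.

```shell
# Sketch only: cluster name, region, and node count are placeholders.
eksctl create cluster \
  --name wa-cluster \
  --region us-east-1 \
  --nodes 3

# Confirm that kubectl can reach the new cluster
kubectl get nodes
```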
- Azure Kubernetes Service (AKS)
- Deploy and manage IBM Workload Scheduler containerized product components on Azure AKS, a container orchestration service available on the Microsoft Azure public cloud. You can use Azure AKS to deploy, scale up, scale down, and manage containers in the cluster environment. You can also deploy and run an Azure SQL database.
For more information, see Deploying on Azure AKS.
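Creating the AKS cluster itself is usually done with the Azure CLI. The commands below are a sketch: the resource group, cluster name, location, and node count are placeholders, and your Azure subscription is assumed to be already logged in with `az login`.

```shell
# Sketch only: resource group, location, cluster name, and sizes are placeholders.
az group create --name wa-rg --location eastus

az aks create \
  --resource-group wa-rg \
  --name wa-cluster \
  --node-count 3 \
  --generate-ssh-keys

# Merge the cluster credentials into your local kubeconfig
az aks get-credentials --resource-group wa-rg --name wa-cluster
kubectl get nodes
```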
- Google GKE
- Google Kubernetes Engine (GKE) provides a managed environment for deploying, managing, and scaling your containerized applications using Google infrastructure. The Google GKE environment consists of multiple machines grouped together to form a cluster. You can also deploy and run Google Cloud SQL for SQL Server.
Google GKE supports session affinity in a load-balancing cluster, a feature which keeps each user session active on the same pod. This ensures that the Dynamic Workload Console always connects to the same server during a session and that the user can perform any number of operations smoothly and seamlessly. For more information, see Deploying on Google GKE.
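In Kubernetes terms, this kind of stickiness can be expressed with the `sessionAffinity` field on a Service. The fragment below is a generic illustration, not the product's actual chart values: the service name, selector label, and ports are placeholders.

```yaml
# Illustrative Service fragment: names, labels, and ports are placeholders.
apiVersion: v1
kind: Service
metadata:
  name: console-service          # placeholder service name
spec:
  selector:
    app: dwc                     # placeholder pod label
  ports:
    - port: 443
      targetPort: 9443           # placeholder container port
  sessionAffinity: ClientIP      # route each client IP to the same pod
  sessionAffinityConfig:
    clientIP:
      timeoutSeconds: 10800      # keep the affinity for up to 3 hours
```

With `sessionAffinity: ClientIP`, requests from the same client IP are routed to the same pod for the configured timeout, which is what keeps a console session on one server.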
- Red Hat OpenShift
- You can deploy the IBM Workload Automation components using IBM certified containers. For further information, see Deploying IBM Workload Automation components using helm charts.
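A Helm-based deployment of this kind generally follows the add-repository, update, install pattern. The sketch below uses placeholder values throughout: the repository name and URL, chart name, release name, namespace, and the `--set` value are all assumptions, so take the real values from the linked IBM documentation for your release.

```shell
# Sketch only: repository URL, chart name, release name, namespace, and
# values are placeholders -- use the ones from the IBM documentation.
helm repo add example-repo https://charts.example.com/ibm
helm repo update

helm install wa-release example-repo/workload-automation \
  --namespace workload-automation \
  --create-namespace \
  --set license=accept   # placeholder value; check the chart's values file
```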