Installing IBM Workload Scheduler

Available installation methods

This section provides the information required before you install the product. The available installation methods are listed below, together with considerations to help you choose between them:

Advantages of the command-line installation
The command-line installation is a simple procedure that supports installing all components (master domain manager, backup domain manager, dynamic domain manager, backup dynamic domain manager, Dynamic Workload Console, and agents) using dedicated commands. You can maintain the default values already defined in the properties file, specify all or part of the parameters on the command line when you type the command, or edit all or part of the parameters stored in the properties file. To proceed with the command-line installation, skip to Installing from the command-line interface.
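The three ways of supplying parameters can be illustrated with a minimal shell sketch. The file name and the keys below are hypothetical, chosen only to show the pattern; the actual properties file and parameter names depend on the component you are installing:

```shell
# Hypothetical properties file; real file names and keys come from
# the component's installation package.
cat > install.properties <<'EOF'
WA_USER=wauser
INST_DIR=/opt/wa
DB_TYPE=DB2
EOF

# Option 1: keep the defaults already defined in the properties file.
grep '^DB_TYPE' install.properties    # prints DB_TYPE=DB2

# Option 2: override part of the parameters, here simulated by
# editing the stored value before running the installer.
sed -i 's/^DB_TYPE=.*/DB_TYPE=ORACLE/' install.properties
grep '^DB_TYPE' install.properties    # prints DB_TYPE=ORACLE
```

The same effect is obtained by passing the parameter directly on the installation command line, which takes precedence over the value stored in the file.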
Advantages of the Docker deployment
The Docker installation consists of a set of pre-installed images for the master domain manager, the Dynamic Workload Console, and the DB2 database. All you have to do is launch the Docker installation commands.

Docker is a state-of-the-art technology which creates, deploys, and runs applications by using containers. A package contains an application together with all of the components it requires, such as libraries, specific configurations, and other dependencies, so that you can deploy it in no time on any other Linux or Windows workstation, regardless of any different settings between the source and the target workstation.

Docker adoption ensures standardization of your workload scheduling environment and provides an easy way to replicate environments quickly across development, build, test, and production, significantly shortening the time it takes to move from build to production. Install your environment using Docker to improve scalability, portability, and efficiency.

To proceed with the Docker installation, skip to Deploying containers with Docker.
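The set of pre-installed images described above can be sketched as a compose-style configuration. The image names, tags, ports, and variables below are hypothetical placeholders; the real values are those documented for your product version:

```yaml
# Hypothetical docker-compose sketch of the three pre-installed images.
services:
  db2:
    image: example/workload-scheduler-db2:latest      # hypothetical image name
    environment:
      - LICENSE=accept
  server:
    image: example/workload-scheduler-server:latest   # hypothetical image name
    depends_on:
      - db2
  console:
    image: example/workload-scheduler-console:latest  # hypothetical image name
    ports:
      - "9443:9443"
```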
Advantages of the Red Hat OpenShift deployment

The IBM Workload Automation product components can be deployed on Red Hat OpenShift V4.x using IBM® certified containers, on a Kubernetes-based container application platform that orchestrates containerized applications. You can then manage the IBM Workload Automation containers from the OpenShift dashboard or from the command-line interface.

The IBM Workload Automation agent container can be deployed on OpenShift V3.x, a Kubernetes-based container application platform that orchestrates containerized applications. By using OpenShift, you can deploy the IBM Workload Automation agent container with a template.yml file to quickly configure and run it as a Docker container application in a Kubernetes cluster. You can then manage the IBM Workload Automation agent container from the OpenShift dashboard or from the command-line interface.

With OpenShift, you can implement distributed, advanced, and scalable services based on Docker container technology and orchestrated by Kubernetes. For more information, see Deploying IBM Workload Automation components on Red Hat OpenShift.
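Deploying from a template file follows the standard OpenShift `oc` pattern. A minimal sketch, assuming a hypothetical `AGENT_NAME` parameter; the actual template parameters are the ones shipped with the agent container package:

```shell
# Render the template with a parameter value and apply the result.
oc process -f template.yml \
   -p AGENT_NAME=MYAGENT \
   | oc apply -f -

# Verify the resulting pods from the command line.
oc get pods
```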

Advantages of deploying on Amazon EKS

To respond to the growing demand to make automation opportunities more accessible, IBM Workload Scheduler is now offered on the Amazon Web Services cloud. Within just a few minutes, you can access the product Helm chart and container images and easily launch an instance to deploy an IBM Workload Scheduler server, console, and agents with full on-premises capabilities on AWS. IBM Workload Scheduler on AWS improves the flexibility and scalability of your automation environment. It helps lower costs and eliminate complexity, while reducing the operational overhead and the burden of managing your own infrastructure, so you can invest your time and resources in growing your business. IBM Workload Scheduler on AWS also delivers faster access to managed services solutions, for full product lifecycle management.

For more information, see Deploying on Amazon EKS.
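A Helm-based deployment of this kind typically follows the standard Helm 3 workflow; the same pattern applies to the other Kubernetes-based targets described in this section. The repository URL, chart name, release name, and namespace below are hypothetical placeholders; use the values published for your product version:

```shell
# Register the chart repository (hypothetical URL and repository name).
helm repo add wa-charts https://example.com/charts
helm repo update

# Deploy server, console, and agents from the Helm chart.
# values.yaml holds your configuration overrides.
helm install myrelease wa-charts/ibm-workload-scheduler \
     -n workload-scheduler --create-namespace \
     -f values.yaml
```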
Advantages of deploying on Azure Kubernetes Service (AKS)
You can use Azure AKS to deploy, scale up, scale down, and manage containers in the cluster environment. Use the IBM Workload Scheduler Helm chart and container images to deploy the server, console, and dynamic agent to the Azure AKS public cloud. Azure AKS also gives you access to helpful services such as Azure SQL Database, a highly scalable cloud database service. For more details, see Deploying on Azure AKS.
Advantages of deploying on Google GKE

Google Kubernetes Engine (GKE) provides a managed environment for deploying, managing, and scaling your containerized applications using Google infrastructure. The Google GKE environment consists of multiple machines grouped together to form a cluster. You can also deploy and run Google Cloud SQL for SQL Server.

Google GKE supports session affinity in a load-balancing cluster, a feature that keeps each user session active on the same pod. This ensures that the Dynamic Workload Console always connects to the same server during a session and that the user can perform any number of operations smoothly and seamlessly.
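Session affinity of this kind can be illustrated with a generic Kubernetes Service definition. This is a minimal sketch, assuming a hypothetical `app: console` pod label and service name; the mechanism actually used by the chart on GKE may differ (for example, affinity configured at the load-balancer or ingress level):

```yaml
apiVersion: v1
kind: Service
metadata:
  name: console            # hypothetical service name
spec:
  selector:
    app: console           # hypothetical pod label
  ports:
    - port: 9443
      targetPort: 9443
  # Route every request from the same client IP to the same pod,
  # keeping the user session on one backend.
  sessionAffinity: ClientIP
```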

For more information, see Deploying on Google GKE.