How to install our product: Local vs shared installation
StacyPedersen
IBM® Spectrum Conductor with Spark is a multitenant solution for Apache Spark that lets you efficiently deploy and manage multiple Spark deployments. When you install IBM Spectrum Conductor with Spark version 2.2.0, you can choose either a local installation or a shared installation. Previously, these were referred to as a local file system installation and a shared file system installation.
Which one should you choose?
Option 1: Local installation
When you install IBM® Spectrum Conductor with Spark to a local environment, you install the product onto every host in the cluster.
You must install and configure IBM® Spectrum Conductor with Spark on at least one management host. Management hosts provide specialized services to the cluster. The first management host that you install on becomes your master host. Additionally, you can install IBM Spectrum Conductor with Spark on one or more compute hosts. Compute hosts provide computing resources.
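To make the local option concrete, here is a minimal sketch of rolling the installer out to every host. The host names and the installer file name are assumptions, not taken from the product documentation, and the script only prints the commands it would run (a dry run) rather than executing them:

```shell
# Dry-run sketch of a local installation: the product is installed on every host.
# INSTALLER, MASTER, and HOSTS are hypothetical values for illustration only.
INSTALLER=cws-2.2.0.0_x86_64.bin   # assumed installer package name
MASTER=host1                        # the first management host you install becomes the master
HOSTS="host1 host2 host3"           # host1, host2: management hosts; host3: compute host

for h in $HOSTS; do
  # In a real rollout you would run the installer on each host (here we only print the command).
  echo "ssh $h ./$INSTALLER"
done
```

The key point the loop illustrates: with a local installation, the install step is repeated per host, starting with the master.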
If you want to set up a high availability (HA) environment for the management hosts, you perform a local installation and then configure a shared directory for failover. The shared directory holds configuration information and common files that all management hosts look up.
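As a hedged sketch of that HA step: after the local installation, each management host is pointed at the shared directory so that any of them can take over on failover. The shared path, host names, and the exact configuration command shown here are assumptions; check the product documentation for the supported procedure. The script is a dry run that only prints the commands:

```shell
# Dry-run sketch of configuring a shared directory for management-host failover.
# SHARED_DIR and MGMT_HOSTS are hypothetical; the real command and path may differ.
SHARED_DIR=/share/cwsha      # assumed shared location visible to all management hosts
MGMT_HOSTS="host1 host2"

for h in $MGMT_HOSTS; do
  # On each management host, a configuration command would record the shared directory
  # (shown here as an assumed egoconfig invocation, printed rather than executed):
  echo "ssh $h egoconfig mghost $SHARED_DIR"
done
```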
Option 2: Shared installation
When you install to a shared environment, you install IBM Spectrum Conductor with Spark once on a shared file system (such as IBM Spectrum Scale™) that every host in the cluster can access.
Before you install the product, you set up the shared file system in your environment. For shared storage, you can use IBM Spectrum Scale or Network File System (NFS). For a list of supported file systems, see Supported file systems for high availability.
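For example, with NFS the same export would be mounted at the same path on every host, so the single product installation is visible cluster-wide. The server name, export path, and mount point below are assumptions for illustration only:

```
# /etc/fstab entry on every host in the cluster (all values are hypothetical):
nfs1:/export/cws  /opt/ibm/cws  nfs  defaults,_netdev  0 0
```

With IBM Spectrum Scale, the equivalent step is mounting the same Spectrum Scale file system on every host.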
Do you have any installation questions? Let us know in our Slack channel!