IBM Spectrum Conductor® is an enterprise-class, multitenant platform for deploying and managing Apache Spark, Anaconda, Dask and other application frameworks and services on a common shared cluster of resources. It can run multiple different versions of these applications concurrently while dynamically allocating and sharing resources between tenants and applications. IBM Spectrum Conductor provides enterprise security and performance at scale, and it maximizes resource usage and sharing to consolidate resource silos that would otherwise be tied to separate application environments.
Efficient resource scheduling and shared infrastructure result in shorter application wait times, higher throughput and faster analytics.
Maximize usage of resources and eliminate resource silos that would otherwise be tied to separate instances and versions of Spark and other applications.
Get support for hundreds of applications and users and thousands of servers. Scale your IBM Watson® Studio AI jobs with distributed execution on an IBM Spectrum Conductor cluster.
Protect data with this multitenant solution that provides end-to-end security and runtime isolation. Conductor offers Spark and application lifecycle management with IBM support and services included.
A consolidated framework for deploying, managing, monitoring and reporting reduces administrative overhead.
Run the included Spark distribution for easier deployment of a full analytics environment, for both exploratory projects and production.
Get security-rich support for multiple users and groups with authentication, authorization and runtime isolation. Run multiple applications, frameworks and services, such as Spark, Python, Anaconda and Dask, on shared resources.
Add and remove resources from your cluster automatically, based on policy and workload demand, with support for a hybrid mix of on-premises and cloud hosts.
Simplify administration across a scale-out, distributed infrastructure.
Run multiple concurrent and different versions of Spark and other compute frameworks and services on a secure multitenant platform.
Easily deploy and maintain Apache Spark with an integrated Spark distribution.
Scale compute and storage infrastructure independently, and allocate resources flexibly according to each application's compute and memory requirements.