Cost-efficient high-performance new-generation IT

New-generation IT management systems help increase the usable capacity of your existing data centers without the need to add more servers. It’s not uncommon to double or triple your computing capacity, delivering faster answers and greater insights at lower cost.

The key is a software capability called cluster virtualization. It is common in high-performance computing (HPC) and supercomputing, and it is gaining adoption in commercial IT and data analytics. Cluster virtualization is much more than running a bunch of virtual machines (VMs).

The fundamental thing to understand is that new-generation workloads, such as big data analytics, cognitive computing, artificial intelligence, machine learning, deep learning and Docker container environments, are not like traditional commercial applications designed to run on a single computer or VM. They are designed to run across a cluster of computers working together. This is a different architecture from traditional commercial IT, and it is one that has been used for decades in HPC and supercomputing.
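To make the contrast concrete, here is a minimal sketch of how a cluster-native analytics job is expressed, using Apache Spark as one example of the open-source frameworks mentioned above. The cluster master URL is a hypothetical placeholder; the point is that the same few lines of driver code fan work out across every node the scheduler grants, rather than being bound to a single machine or VM.

```python
# Minimal PySpark sketch: the driver submits work that the cluster
# scheduler spreads across many worker nodes, not a single computer.
# The master URL "spark://cluster-master:7077" is a hypothetical example.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("cluster-native-example")
    .master("spark://cluster-master:7077")  # any cluster manager can stand in here
    .getOrCreate()
)

# Distribute 100 million records across the cluster and aggregate them;
# each partition is processed in parallel on whichever node holds it.
rdd = spark.sparkContext.parallelize(range(100_000_000), numSlices=200)
total = rdd.map(lambda x: x * x).reduce(lambda a, b: a + b)
print(f"sum of squares: {total}")

spark.stop()
```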

Benefit from experience

In the early days of client-server computing, applications were deployed on their own servers, leading to very costly and inefficient “server sprawl.” Data centers were filled with underutilized servers that consumed unnecessary space, electricity, cooling and the attention of administrators. Today we often see a similar mistake repeated with analytics and cluster-based apps, where each gets deployed on its own less-than-optimally utilized cluster. This creates a new-generation problem called “cluster creep” or “cluster sprawl.”

To solve server sprawl, hypervisor software was used to virtualize systems so that each app ran in its own VM and shared a physical server – dramatically improving utilization and overall data center cost efficiency.

To solve cluster sprawl, cluster virtualization software is required, and like system virtualization, it can dramatically improve utilization and cost efficiency. Cluster virtualization can also enable higher performance for critical workloads than is possible when each workload is confined to its own cluster silo.
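As a back-of-the-envelope illustration (the numbers below are purely hypothetical), consolidating siloed clusters into one shared pool raises utilization because idle capacity in one silo can absorb peak demand from another:

```python
# Toy illustration with hypothetical numbers: two dedicated clusters,
# each sized for its own peak, versus one shared pool sized for the
# combined peak. Peaks that do not coincide let the shared pool run
# smaller and hotter.
peak_demand = {"analytics": 60, "deep_learning": 50}      # nodes needed at peak
average_demand = {"analytics": 20, "deep_learning": 15}   # nodes used on average

# Siloed: each workload owns a cluster sized to its individual peak.
siloed_nodes = sum(peak_demand.values())                   # 110 nodes
siloed_util = sum(average_demand.values()) / siloed_nodes  # ~32%

# Shared: one pool sized to the combined peak, assuming the two peaks
# rarely coincide (here, 80 nodes covers the observed combined peak).
shared_nodes = 80
shared_util = sum(average_demand.values()) / shared_nodes  # ~44%

print(f"siloed: {siloed_nodes} nodes at {siloed_util:.0%} utilization")
print(f"shared: {shared_nodes} nodes at {shared_util:.0%} utilization")
```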

Proven high-performance cluster virtualization for new-gen IT

IBM Spectrum Computing software has more than two decades of success running some of the world’s most complex and data-demanding workloads on shared compute clusters. With the latest addition to the portfolio, IBM Spectrum Conductor, IBM has delivered this capability for today’s generation of open-source frameworks for big data analytics, machine and deep learning, cognitive computing and artificial intelligence, and Docker container environments. While other cluster virtualization options exist for individual frameworks, we don’t believe any of them supports this diversity of workloads with the high-performance scale and reliability of Spectrum Computing.

Cost efficient speed = competitive advantage

Spectrum Computing can help clients achieve greater performance on the infrastructure they already own, and help defer anticipated hardware purchases, easing the strain on already tight budgets.

For organizations deploying a diverse set of new-generation workloads, with or without traditional HPC and analytics, Spectrum Computing is designed to deliver cost-efficient, reliable and predictable performance at scale. Don’t let your competition unlock this advantage before you do.
