August 12, 2019 By Kevin Jackson 2 min read

Composable architectures are quickly becoming a feature of the modern enterprise.

In its Hype Cycle for I&O Automation 2019 report, Gartner suggests particular benefits of composable infrastructures in “use cases where infrastructure must be resized frequently, or where composability increases the utilization of high-cost components.”

Why composable architecture?

Composable infrastructures deliver compute, storage and network resources as services from multiple logical resource pools.

The approach treats infrastructure like applications. It enables IT to construct new systems from collections of software-defined building blocks, which are managed as code. Infrastructure automation tools provision the required infrastructure on demand.
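The "building blocks managed as code" idea can be sketched in miniature. The example below is a hypothetical illustration, not any vendor's real API: it models logical resource pools and composes a system by allocating its declarative spec from those pools on demand (all names such as `ResourcePool` and `compose` are invented for this sketch).

```python
from dataclasses import dataclass


@dataclass
class ResourcePool:
    """A logical pool of one resource type (e.g., vCPUs, GB of storage)."""
    name: str
    capacity: int
    allocated: int = 0

    def allocate(self, amount: int) -> None:
        """Claim resources from the pool, failing if capacity is exceeded."""
        if self.allocated + amount > self.capacity:
            raise RuntimeError(f"{self.name} pool exhausted")
        self.allocated += amount

    def release(self, amount: int) -> None:
        """Return resources to the pool when a system is decomposed."""
        self.allocated = max(0, self.allocated - amount)


def compose(spec: dict[str, int], pools: dict[str, ResourcePool]) -> dict[str, int]:
    """Provision a system by allocating its declarative spec from shared pools."""
    for resource, amount in spec.items():
        pools[resource].allocate(amount)
    return spec


# Shared logical pools that many composed systems draw from
pools = {
    "compute": ResourcePool("compute", capacity=64),
    "storage": ResourcePool("storage", capacity=10_000),
    "network": ResourcePool("network", capacity=16),
}

# A system is just data ("managed as code"); composing it claims pooled resources
web_tier = compose({"compute": 8, "storage": 500, "network": 2}, pools)
print(pools["compute"].allocated)  # 8
```

Because each system is described as data rather than tied to fixed hardware, resizing it is simply releasing and re-allocating against the pools, which is what makes frequent resizing and higher utilization of costly components practical.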

The challenge

The challenge with this approach is that many organizations aren’t yet prepared to operate the data center automation and multicloud environments that are foundational to a composable infrastructure. They need to walk before they run. To do so effectively, enterprises should implement a data center infrastructure design and optimization strategy focused on application portfolio rationalization.

Tackling the modernization task this way, from the infrastructure side, turns legacy data centers into private clouds and migrates existing legacy or packaged applications onto this highly automated environment.

Taking this initial step toward a hybrid cloud environment enables a rational and collaborative adoption of public cloud infrastructure services (IaaS). It can also reduce friction often caused by the need to retrain staff in public cloud operations, modern infrastructure technologies and composable solution management tools.

IDC guidance on managing composable architectures

A recent report from IDC about the use of multicloud and hybrid cloud services highlights related composable architecture management issues such as:

  • The need for enterprises to become more agile in their responsiveness to customers
  • Organizational requirements to optimize ROI from all investments, especially IT
  • Requirements for addressing the technical complexity of adopting the cloud model
  • An imperative to transform the CIO and IT function into a collaboration and integration hub for all other enterprise executives and functions

IDC suggests that organizations looking to partner with services firms to effectively deploy composable services architectures should work with firms that also provide:

  • Technology advisory and consulting services that help in application portfolio rationalization and modernization, robust security blueprint design, and data center infrastructure design and optimization
  • Transformational roadmap development support for upgrading legacy infrastructure to private cloud using existing assets and for replacing existing infrastructure with public cloud infrastructure as a service (IaaS)
  • The experience and know-how to deliver more agility and speed from IT, increasing revenue by enabling firms to build new revenue-generating products and services faster while also addressing the management complexities of deploying and managing applications across multicloud environments

This recommended path toward implementing a hybrid cloud environment optimizes the composable architecture strategy to build new revenue-generating products and services while simultaneously addressing key inhibitors to change.
