Data centers today have to manage and mine massive volumes of data – data that seems to be growing at the speed of light and arriving from disparate sources. All of this can be very complex to manage and very expensive to maintain, for hardware, software, and networking, to say the least. So many organizations are moving their data centers to cloud environments to benefit from shared-services concepts and multiple deployment models. Cloud deployment models also leverage advances in storage and server virtualization, providing these resources to meet organizations' dynamic systems demands.
But market needs are changing rapidly, and demands are outpacing system performance. Even with all the advances made in cloud technologies, the increasing complexity of solution deployments and system demands is creating bottlenecks. There is an ever-increasing need for streamlined, smarter delivery of shared services to make data centers more dynamic and flexible.
Organizations are seeking 'workload automation and optimization' and are typically asking for the following:
Workloads: the ability to dynamically assign resources to applications based on their needs and the best available resources.
Open APIs: Enterprises deploy solutions from a broad spectrum of solution providers, so there is strong demand for a programmable infrastructure based on open APIs.
Continuous monitoring & optimization: Enterprises want to continuously monitor their systems for threats and for performance, and respond immediately to both.
Proactive management: There is always a requirement to improve the efficiency of enterprise IT. Companies want to know the highs and lows of market trends and align their service delivery accordingly. This helps them perform to market demands and control costs as well.
Security & compliance: Enterprises are constantly seeking analytics-based compliance checking. The requirement is to reduce security exposures and business risk.
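To make the 'Open APIs' item above concrete, here is a minimal sketch of what a programmable-infrastructure request might look like. The endpoint, field names, and values are purely illustrative assumptions, not from any specific product or provider.

```python
import json

# Hypothetical payload for an open provisioning API; every field name here
# is an illustrative assumption, not a real product's schema.
def provision_request(app, cpus, memory_gb, storage_gb):
    """Build the JSON body a programmable-infrastructure API might accept."""
    return json.dumps({
        "application": app,
        "compute": {"cpus": cpus, "memory_gb": memory_gb},
        "storage": {"size_gb": storage_gb},
        "network": {"tier": "standard"},
    })

body = provision_request("order-service", cpus=4, memory_gb=16, storage_gb=100)
# An HTTP client would POST this body to the provider's endpoint, e.g.:
# requests.post("https://infra.example.com/v1/provision", data=body)
print(body)
```

The point is simply that compute, storage, and network are requested through one programmable interface rather than through separate manual processes per provider.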
Creating a simplified, responsive, and adaptive infrastructure on cloud to implement such a broad set of capabilities needs another layer of automation. This automation is now possible through the Software Defined Environment (SDE) approach, which provisions the entire infrastructure - compute, storage, and network - by responding in real time to workload demands.
SDE delivers the next generation of infrastructure automation and implements all of the 'workload automation and optimization' requirements described above. It dynamically assigns resources to applications based on their requirements, the best available resources, and the SLAs that are in place.
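The dynamic assignment just described can be sketched as a simple placement decision: given a workload's resource needs and an SLA, pick the cheapest resource pool that satisfies both. The pool data and field names below are invented for illustration; a real SDE would draw them from live monitoring.

```python
# Toy SLA-aware placement: choose the cheapest pool that meets both the
# workload's capacity needs and its latency SLA. All names are hypothetical.
def assign(workload, pools):
    candidates = [
        p for p in pools
        if p["free_cpus"] >= workload["cpus"]
        and p["free_gb"] >= workload["gb"]
        and p["latency_ms"] <= workload["sla_latency_ms"]
    ]
    if not candidates:
        return None  # nothing meets the SLA; an SDE would scale out instead
    return min(candidates, key=lambda p: p["cost"])

pools = [
    {"name": "pool-a", "free_cpus": 8,  "free_gb": 64,  "latency_ms": 20, "cost": 1.0},
    {"name": "pool-b", "free_cpus": 32, "free_gb": 256, "latency_ms": 5,  "cost": 2.5},
]
workload = {"cpus": 16, "gb": 128, "sla_latency_ms": 10}
best = assign(workload, pools)
print(best["name"])  # pool-b: the only pool meeting both capacity and SLA
```

When business rules change (say, a tighter latency SLA), only the workload's declared requirements change; the same orchestration logic re-evaluates placement automatically.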
The SDE approach is about encapsulating business needs into business delivery processes. SDE does this by applying the entire set of business rules, including performance metrics, business policies, and compliance requirements, to the infrastructure management processes. These rules define the resource requirements for storage, network, and compute. When business needs change, the rules are dynamically altered and the SDE automatically orchestrates resource allocation for the changing workloads. Truly, the overall IT infrastructure matters for a Software Defined Environment.

Stay tuned for my next post, where I will discuss a couple of simple examples to demonstrate how Software Defined Environments act as a foundation for mobile, social, big data & analytics, and more. Until then, if you have any questions or thoughts, join me on Twitter at @sperepa. Thanks for reading!
Sujatha (Suj) Perepa
IBM Software Client Architect