4 reasons why hybrid cloud is becoming the new normal


At IBM Interconnect 2015 we heard the prediction that “cloud” will become the common name for what is known today as “hybrid cloud.” In other words, there will be no division between public and private clouds; every cloud will be a combination of the two, either with or without a traditional IT infrastructure. My colleague Ariel Jirau and I want to share our thoughts on the subject.

To understand why every cloud in the future will be a hybrid cloud, you must realize that cloud is a key enabler of business model transformation. To remain competitive and relevant, every business must transform and adapt.

Let’s examine four major reasons that support the assertion that hybrid clouds will be the standard.

The majority of businesses have some type of existing IT infrastructure, which makes it unrealistic to move all services to the cloud.

As much as service providers would like, few businesses will be able to completely eliminate their IT infrastructure and rely solely on the cloud. Many businesses sign long-term leases for data center space and purchase capital assets for their IT infrastructure that are depreciated over several years. Most businesses simply aren’t positioned to move all legacy applications to the cloud.

While many providers define a hybrid cloud as the utilization of private and public clouds, IBM opens up the idea that traditional IT can be paired and integrated with external or even internal clouds. This means that you don’t need to move all of your traditional IT infrastructure to a private cloud before you can use a hybrid cloud approach.

Businesses can no longer afford to build out dedicated infrastructures that have been designed to support peak capacity needs.

The typical IT infrastructure must be designed and built for peak utilization. Designing, procuring and deploying that infrastructure typically takes between three and twelve months, given the lead time required to reach steady state.

The average utilization of physical servers tends to be no more than 12 to 18 percent. Businesses have begun to virtualize their physical servers to potentially increase this utilization rate into the 30 to 40 percent range, if the virtualized servers are in a resource pool.

One way to increase utilization is to create a private cloud. This cloud would essentially be a group of virtualized servers in resource pools with self-service portals that allow new server workloads to easily be created, modified and deleted. This would expand the ability of a business to use the private cloud across lines of business with different peak needs, and can help drive utilization up to approximately 50 percent.

Driving physical server utilization up near 80 percent usually requires multi-tenant public clouds with a mix of businesses whose peak demand falls in different time periods. Many consider this an oversubscription of servers for individual customers, but the model can succeed precisely because customers hit their peaks at different times.
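The utilization arc above can be illustrated with a little arithmetic. This sketch uses entirely hypothetical hourly load numbers (they are not from the article) for two workloads whose peaks fall at different times, and compares sizing each one for its own peak against sharing one pooled capacity:

```python
# Illustrative sketch: why pooling workloads with offset peaks raises
# average utilization. All load numbers are hypothetical.

def utilization(loads, capacity):
    """Average utilization of a fixed capacity over hourly load samples."""
    return sum(loads) / (capacity * len(loads))

# Hypothetical hourly load (in server units) for two lines of business:
# one peaks during business hours, the other peaks overnight.
retail = [20, 15, 10, 10, 15, 30, 60, 80, 80, 60, 30, 20]
batch  = [70, 80, 80, 60, 30, 10, 10, 10, 10, 20, 40, 60]

combined = [r + b for r, b in zip(retail, batch)]

# Dedicated infrastructure: each workload sized for its own peak.
dedicated_capacity = max(retail) + max(batch)   # 80 + 80 = 160 servers
dedicated_util = utilization(combined, dedicated_capacity)

# Shared resource pool: sized for the combined peak, which is lower
# because the two peaks occur at different times.
pooled_capacity = max(combined)                 # 95 servers
pooled_util = utilization(combined, pooled_capacity)

print(f"dedicated: {dedicated_capacity} servers, {dedicated_util:.0%} utilized")
print(f"pooled:    {pooled_capacity} servers, {pooled_util:.0%} utilized")
# dedicated: 160 servers, 47% utilized
# pooled:    95 servers, 80% utilized
```

With these made-up numbers, pooling cuts the required capacity by roughly 40 percent and lifts average utilization from under 50 percent to about 80 percent, which is the same effect multi-tenancy exploits at public-cloud scale.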

Customer expectations continue to shift.

Gone for many are the days of finding a product or service and sticking with it over time. The Internet lets customers quickly compare prices, availability and reviews of similar products and services. In addition, social media influences purchasing decisions, raising demand for low-volume products and services and often taking local demand global. This drives variability in demand for individual businesses, and non-cloud IT infrastructures simply cannot be adjusted quickly enough to avoid lost sales and missed business opportunities; customers will simply move on to the next provider.

Regulations often force some data to remain on client premises.

While business demand may shift from local to global, local regulations often force some business data to remain local. Typical examples are the European Union Data Protection Directive, as well as governmental regulations covering finance, banking, healthcare and insurance. Keeping this data local while using a cloud for global expansion of the business can be a practical way to satisfy these requirements.
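A hybrid architecture typically encodes this as a placement rule at the point where data is stored. The sketch below is a hypothetical example of such a rule (the region set, function name and labels are illustrative, not from any specific IBM product): regulated data stays on premises, everything else is free to land in the public cloud.

```python
# Hypothetical hybrid-cloud placement rule: data subject to a residency
# regulation stays on premises; everything else can use the public cloud.
# The region set and the rule itself are illustrative assumptions.

RESTRICTED_REGIONS = {"EU"}  # e.g. regions with data-protection residency rules

def placement(record_region: str, contains_personal_data: bool) -> str:
    """Decide where a record may be stored in a hybrid deployment."""
    if contains_personal_data and record_region in RESTRICTED_REGIONS:
        return "on-premises"     # keep regulated data local
    return "public-cloud"        # everything else can burst to the cloud

print(placement("EU", True))    # on-premises
print(placement("US", True))    # public-cloud
print(placement("EU", False))   # public-cloud
```

The point of the rule living in one place is that the rest of the application stays location-agnostic: the business can expand globally on public cloud capacity while the regulated subset of its data never leaves the client premises.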

Join in the conversation and let us know what you think by leaving a comment below or by reaching out to us on Twitter @DavidWeck and @AJirau.

This post was co-authored by Ariel Jirau
