Service management for hybrid clouds


To make sure we all mean the same thing by hybrid cloud, let me provide a short definition from NIST:

The cloud infrastructure is a composition of two or more distinct cloud infrastructures (private, community, or public) that remain unique entities, but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load balancing between clouds).

The hybrid model of cloud computing combines the advantages of both private and public deployment models. The classification between those two models is shown in the following graphic:

I believe that hybrid clouds will be the future of cloud computing. Only a few workloads are perfectly suited to deployment in purely private or purely public environments.

Imagine a private cloud infrastructure that demands a huge number of CPUs for only a few days a year, for example retail applications during the Christmas season. During this peak, all applications carry a high workload, so more resources are needed than the private cloud can provide. Of course, the cloud could be sized with more hardware to meet the demand, but for the remainder of the year that hardware would sit unused. So even if the retailer uses private cloud technology, the resource savings would be marginal.

Of course, a public cloud would be an answer, but what if retailers don't want to run their applications entirely on public infrastructure for security reasons? The idea is to offload the additional workload to a public cloud, or to move non-critical applications there.

The concept behind hybrid cloud computing is much more than provisioning of virtual machines. Let me give you one example of why it is necessary to manage cloud services.

It has become very common for employees to pay for cloud infrastructure services with their private credit cards and get their money back at the end of the month through expense reimbursement. This might work for test and development environments, but as soon as production environments are involved, companies want control of and insight into their assets. No matter which IT service management framework an organization uses, it definitely requires management and automation of cloud service assets.

Service management considers the full lifecycle of a cloud service. It starts with the definition of the service and is followed by the creation and registration of the service. After the subscription, typically by the line of business, an administrator needs to operate the service. This might also include passing on the charges to another department. At the end of the service lifecycle, each service instance must be terminated.
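The lifecycle above can be sketched as a simple state machine. The state names and transition rules below are my own illustrative assumptions, not part of any specific product API:

```python
# A minimal sketch of the cloud service lifecycle described above:
# define -> register -> subscribe -> operate -> terminate.
# State names are illustrative assumptions, not a product API.
from enum import Enum, auto

class ServiceState(Enum):
    DEFINED = auto()
    REGISTERED = auto()
    SUBSCRIBED = auto()
    OPERATING = auto()
    TERMINATED = auto()

# Allowed forward transitions in the lifecycle.
TRANSITIONS = {
    ServiceState.DEFINED: ServiceState.REGISTERED,
    ServiceState.REGISTERED: ServiceState.SUBSCRIBED,
    ServiceState.SUBSCRIBED: ServiceState.OPERATING,
    ServiceState.OPERATING: ServiceState.TERMINATED,
}

class ServiceInstance:
    def __init__(self, name: str):
        self.name = name
        self.state = ServiceState.DEFINED

    def advance(self) -> ServiceState:
        """Move the instance to the next lifecycle stage."""
        if self.state is ServiceState.TERMINATED:
            raise ValueError("terminated instances cannot advance")
        self.state = TRANSITIONS[self.state]
        return self.state

svc = ServiceInstance("retail-web")
while svc.state is not ServiceState.TERMINATED:
    svc.advance()
print(svc.state.name)  # TERMINATED
```

The point of modeling the lifecycle explicitly is that every instance can be accounted for from definition to termination, which is exactly what the credit-card approach above cannot guarantee.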

The goal is to achieve greater visibility, control, and automation across a company's assets and computing environments, regardless of where they reside.

I am not aware of any other solution on the market that provides the capability to integrate IT service management with service management of cloud assets. This was our vision for a product called Cast Iron, which we acquired in March 2010. IBM WebSphere Cast Iron is the leader in data integration from on-premises applications, such as SAP, to cloud applications. If you want to know more about Cast Iron, I recommend the blog entry “Cloud integration with Cast Iron” by Megan Irvine.

Cast Iron hides the complexity of interfaces behind predefined connectors, which makes data integration very easy. At its core, service management for hybrid clouds is nothing more than using the (rather complex) APIs of different applications and transferring data between them. Cast Iron provides both a powerful platform and the capability to connect many cloud applications. Combined with Tivoli software for hybrid clouds, off-premises resources can now be managed to the same standard, with the same infrastructure, as resources inside the enterprise walls.

This approach is realized through optionally installable bundles for both the physical and the virtual Cast Iron integration appliance, so you can easily add IT service integration to Cast Iron's data integration capabilities. Currently, these bundles are offered as a complimentary download for the Cast Iron appliance. The current version supports four use cases:

  • Data Integration covers the traditional Cast Iron integration scenarios, such as synchronizing customer records between an ERP and a CRM system. This is nothing new; it has already been proven at hundreds of customer sites.
  • Hybrid Monitoring facilitates monitoring and managing off-premises resources using an existing Tivoli monitoring installation. It provides visibility into off-premises “cloud” infrastructure through the same UI used for on-premises resources. A typical example would be to include monitoring for Amazon EC2 or IBM SmartCloud Enterprise instances in an existing monitoring infrastructure.
  • Hybrid Security manages access to off-premises resources and data using the same user directory maintained on premises. Employees can quickly register for cloud services and create new accounts. This works well until employees leave the company. They might sign a paper stating that they have deleted all their accounts, but this needs to be verified in some way. A company is in deep trouble if a former employee keeps an account, starts working for a competitor, and still has insight into critical business data. This scenario provides the ability to automatically add and remove users in LotusLive based on an on-premises LDAP user directory.
  • Hybrid Provisioning enables the creation, management, and tear-down of server resources on cloud infrastructure providers. This provides a single point of control for provisioning and managing on- and off-premises resources from the Tivoli Service Automation Manager user interface. The scenario gets more interesting when the full capabilities of Tivoli Service Automation Manager are leveraged: a rules engine, such as IBM ILOG, makes it possible to provision resources based on a rule set, for example deploying test instances at the public cloud provider with the lowest cost.
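The rule-based provisioning idea from the last bullet can be sketched in a few lines. The provider names and hourly prices below are made up for illustration; a real deployment would use a rules engine such as IBM ILOG and live pricing data:

```python
# A hypothetical sketch of rule-based provisioning: test instances go to
# the cheapest public provider, everything else stays on premises.
# Provider names and prices are invented for illustration only.
PROVIDERS = {
    "public-cloud-a": 0.12,  # hourly price, illustrative
    "public-cloud-b": 0.09,
    "private-cloud": 0.20,
}

def choose_provider(workload_type: str) -> str:
    """Apply a simple provisioning rule set to pick a target provider."""
    if workload_type == "test":
        # Rule: test instances go to the lowest-cost public provider.
        public = {p: c for p, c in PROVIDERS.items() if p.startswith("public")}
        return min(public, key=public.get)
    # Rule: all other workloads stay on the private cloud.
    return "private-cloud"

print(choose_provider("test"))        # public-cloud-b
print(choose_provider("production"))  # private-cloud
```

A production rule set would of course weigh more than price, for example data residency or compliance constraints, but the structure is the same: classify the workload, then apply the matching placement rule.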

The combination of the Hybrid Provisioning and Hybrid Monitoring bundles allows resource allocation to be optimized based on dynamic workload behavior. Remember the retailer scenario I described: this combination would enable the retailer to dynamically scale workloads out and in as load increases or decreases. The retailer would define threshold levels for the intended resource utilization. If the actual workload exceeds the threshold, new resources are requested automatically; when the workload decreases, the additional resources are deprovisioned. This approach keeps costs, IT capacity, and regulatory concerns under control.
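The threshold-driven scaling loop can be sketched as follows. The threshold values and the reconcile function are my own assumptions for illustration; in the article's scenario, the actual provisioning and monitoring would be handled by the Hybrid Provisioning and Hybrid Monitoring bundles:

```python
# A minimal sketch of threshold-based scale-out/scale-in, as in the
# retailer scenario. Threshold values are illustrative assumptions.
SCALE_OUT_THRESHOLD = 0.80  # utilization above this requests new resources
SCALE_IN_THRESHOLD = 0.30   # utilization below this releases resources
MIN_INSTANCES = 2           # baseline capacity kept on the private cloud

def reconcile(utilization: float, instances: int) -> int:
    """Return the new instance count for the observed utilization."""
    if utilization > SCALE_OUT_THRESHOLD:
        return instances + 1  # scale out to the public cloud
    if utilization < SCALE_IN_THRESHOLD and instances > MIN_INSTANCES:
        return instances - 1  # deprovision off-premises resources
    return instances

# Simulated utilization samples over the Christmas peak and afterwards.
instances = 2
for load in [0.85, 0.90, 0.60, 0.20, 0.25]:
    instances = reconcile(load, instances)
print(instances)  # 2: back to the baseline after the peak
```

The key property is that extra capacity exists only while the peak lasts, which is exactly why the retailer's savings are larger than with an oversized private cloud.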

This holistic approach to hybrid cloud, combining application data and IT service integration, has been implemented in pilot projects at several clients. It is a first step that covers some common scenarios, and it is definitely only the tip of the iceberg; I'm pretty sure there are more scenarios to explore and many other applications to connect.
