May 28, 2014 | Written by: Pethuru Raj
It is an undeniable fact that IT is being systematically simplified, made affordable, less complex, highly agile and widely available. This is thanks to a host of optimization techniques such as rationalization, consolidation, centralization, federation, virtualization, automation and sharing.
IT is on its venerable mission to be the greatest enabler of not only businesses but also people at large. The raging cloud idea succinctly represents this highly organized and optimized IT. That is, a variety of powerful and proven technologies in the competitive enterprise space have converged seamlessly to lay a stimulating foundation for the widely discussed cloud paradigm, which has been spreading steadily ever since.
(Related: Docker moves toward open cloud architecture)
Precisely speaking, cloud technology is neither revolutionary nor new, but it has turned out to be a highly impactful and insightful concept arising from the convergence of multiple proven and promising technologies. Cloud-inspired results have been tremendous for all kinds of worldwide enterprises in their outputs, operations and offerings.
One easy-to-grasp measurement criterion is infrastructure utilization: pre-cloud IT infrastructure utilization was in the range of 10 to 15 percent, whereas in the cloud era it has risen remarkably to 50 to 60 percent. As IT budgets for business houses and behemoths shrink steadily across the globe, continued technological advancements such as the highly visible cloud concept can help businesses remain dynamic, responsive and smart in their decisions, deeds and dealings.
In a nutshell, the transformative, disruptive and innovative cloud technology has brought in a series of significant transformations and optimizations, empowering IT to affordably take on constantly evolving business expectations and people's needs. However, there is growing insistence on incorporating newer kinds of path-breaking technologies and tools in order to further empower and prepare the multifaceted cloud theme to contribute immensely to making IT smarter.
The instantaneous provisioning of IT resources, the dynamic fulfillment of DevOps needs in the impending cloud era, the automation of application deployments, the sharp enhancement of IT utilization and so on are some of the constraints that can be effortlessly addressed with advancements such as the Docker containerization technology. As a first step, we have crafted a Docker container (carrying the standard PetStore web application along with the underlying libraries and binaries) in the IBM SoftLayer cloud. In this blog series, I would like to illustrate how the fast-maturing Docker technology is a great value-add for the cloud paradigm in its long and arduous journey.
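To give a feel for how such a container is assembled, here is a minimal Dockerfile sketch for packaging a Java web application like PetStore together with its runtime. The base image, WAR file name and paths below are illustrative assumptions, not the exact setup used in our SoftLayer experiment:

```dockerfile
# Illustrative sketch: bundle a pre-built web application (e.g. PetStore)
# with its server runtime into one portable image.
# The base image tag and file names are assumptions.
FROM tomcat:7

# Copy the application archive into Tomcat's auto-deployment directory
COPY petstore.war /usr/local/tomcat/webapps/

# Tomcat listens on 8080 by default
EXPOSE 8080

# Start the server in the foreground so the container stays alive
CMD ["catalina.sh", "run"]
```

With a Dockerfile like this, something along the lines of `docker build -t petstore .` followed by `docker run -d -p 8080:8080 petstore` would build the image once and run it on any Docker-enabled host, whether on premises or in a cloud such as SoftLayer.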
The Key Drivers
Clouds are being positioned as the next-generation IT infrastructure for hosting, delivering, and maintaining enterprise-class workloads across industry sectors. Highly process- and data-intensive applications (web, social, business, embedded, mobile, and analytics) and databases (SQL, NoSQL and NewSQL) are being meticulously modernized and migrated to cloud infrastructures (private, public and hybrid) to reap all the benefits of cloud computing.
There are a number of delectable advancements being derived in the cloud domain. Several cloud barriers are being identified, and technology-sponsored solutions are being unearthed in order to make the mesmerizing cloud paradigm more pervasive and persuasive. There are a few noteworthy challenges, and they are being addressed through creative technologies and tools. Normally, when we deploy a web application on a server, we need to prepare by putting the necessary prerequisites on the server in order to run the web application flawlessly. This might mean setting up a database server and a language runtime (Java, Python, etc.). The trouble begins if there is a need to deploy a second application on the same server and that application needs a slightly different version of a particular environment (say, Python).
There are options for meeting this need. One is to use different virtual machines (VMs) for the different application versions (a physical machine can be segmented into multiple VMs). But VMs come at a price and add a lot of overhead: each VM carries its own operating system, resulting in a telling impact on performance, RAM usage and even disk space usage. What Docker does differently is that it allows applications and their environments to be isolated from each other. The troublesome dependencies between applications and their infrastructural modules are eliminated, thereby fulfilling the long-standing objective of any software running on any hardware without any manual intervention, interpretation or instruction.
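To make that isolation concrete, here is a hedged sketch of running the two applications from the scenario above side by side on one host, each container pinned to its own Python version. The image tags are illustrative assumptions:

```shell
# Illustrative only: image tags are assumptions.
# The first application runs against Python 2.7 in its own container.
docker run --rm python:2.7 python --version

# The second application runs against a different Python in a second
# container on the same host -- no per-application VM, no duplicated
# guest operating system, just two isolated user-space environments.
docker run --rm python:3 python --version
```

Each container sees only its own runtime and libraries, so the two versions never conflict, while both share the host's kernel rather than booting a full operating system apiece.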
There are other motivations for the overwhelming success of the Docker concept, which I will address in future blog posts.