The journey from the early days of batch processing to today's
virtualized computing has been an exciting march of progress. The
innovations and ideas along the way have transformed the computing
landscape as we know it, and they promise still more breathtaking
changes to come.
Programs in those days were written on punch cards, also known as
Hollerith cards. A separate terminal would be used to create and edit
the program, resulting in a stack of punched cards. The different
stacks of user programs would be loaded into a card reader, which
would queue the programs for processing by the computer. Each program
would then be executed sequentially, one after the other. Imagine if
our days were structured the same way, with each task having to
complete fully before we could start another. That would be a true
waste of time; while one task makes progress on its own, we could be
focusing on other tasks.
The inefficiencies of batch processing soon became obvious and led to
the development of multi-tasking systems, in which each user's
application is granted a slice of the CPU cycles. The Operating
System (OS) would cycle through the list of processes, granting each
a specific number of cycles to compute each time. This development
soon led to different operating systems including Windows, Unix,
Linux and so on.
Multi-tasking evolved because designers realized that Central
Processing Unit (CPU) cycles were wasted whenever a program waited
for input/output to arrive or complete. Hence the computer's
operating system (OS), its central nervous system, would swap the
waiting user's program out of the CPU and grant the CPU to other
user applications. This way the CPU is utilized efficiently.
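The time-slicing described above can be sketched as a toy round-robin scheduler. This is only an illustration, not a real OS scheduler: process names and cycle counts are invented, and each process simply declares how many cycles of work it needs.

```python
from collections import deque

# Toy round-robin scheduler (illustrative only): every process gets at
# most one fixed quantum of CPU cycles per turn; a process that still
# has work left goes to the back of the queue, just as described above.

def round_robin(processes, quantum):
    """processes: dict of name -> cycles of work needed.
    Returns the order of CPU grants as (name, cycles_used) pairs."""
    ready = deque(processes.items())
    timeline = []
    while ready:
        name, remaining = ready.popleft()
        slice_used = min(quantum, remaining)   # never exceed the quantum
        timeline.append((name, slice_used))
        remaining -= slice_used
        if remaining > 0:
            ready.append((name, remaining))    # back of the queue
    return timeline

# Three hypothetical user programs sharing one CPU, quantum = 2 cycles.
print(round_robin({"payroll": 5, "editor": 3, "backup": 4}, quantum=2))
```

Every program makes steady progress instead of one program monopolizing the CPU until it finishes, which is the essence of the multi-tasking breakthrough.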
The pen analogy: For this analogy, let us consider a fountain pen to
be the CPU. While Joe is writing a document, he uses the fountain
pen. Now, let's assume that Joe needs to print the document. While
Joe saunters off to pick up his printout, the fountain pen is given
to Kartik, who needs it for his tax report. Kartik soon gets tired
and takes a coffee break. Now the pen is given to Jane, who needs to
fill in a form. When Jane completes her form, the pen is handed over
to Joe, who has just returned with his printout. The pen (CPU) is
thus shared efficiently among the many users.
While multi-tasking was a major breakthrough, it did lead to an
organization's applications being developed on different OS
flavors. Hence a large organization would be left with software
silos, each with its own unique OS. This was a problem when the
organization wanted to consolidate all its relevant software under a
common umbrella. For example, a telecom operator may have payroll
applications that run on Windows, accounting on Linux and human
resources on Unix. It thus became difficult for the organization to
get a holistic view of what happened in, say, the Finance department
as a whole. Enter ‘virtualization’. Virtualization enables
applications created for different OSes to run over a layer known as
the “hypervisor” that abstracts the raw hardware.
Virtualization in essence abstracts the raw hardware through a
software application called the hypervisor. The hypervisor runs on
the bare metal of the server. Applications that run over the
hypervisor can use the operating systems of their choice, namely
Windows, Linux, Unix etc. The hypervisor effectively translates the
different OS instructions into the machine instructions of the
underlying processor.
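A toy model may make the translation idea concrete. This is emphatically not how a real hypervisor works internally (real ones trap privileged instructions with hardware support); it only illustrates the shape of the abstraction: different guest conventions mapped onto one underlying hardware primitive. All names here are invented for the example.

```python
# Toy model of the hypervisor's translation role (not a real hypervisor):
# each "guest OS" uses a different call name for writing to disk; the
# hypervisor maps every one of them onto the single primitive that the
# underlying hardware actually provides.

def hardware_write(block, data):
    # Stand-in for the real machine-level operation.
    return f"wrote {data!r} to block {block}"

# Per-guest translation tables: guest call name -> hardware primitive.
TRANSLATION = {
    "windows": {"WriteFile": hardware_write},
    "linux":   {"write":     hardware_write},
    "unix":    {"write":     hardware_write},
}

def hypercall(guest_os, call, *args):
    """Translate a guest OS call into the underlying machine operation."""
    return TRANSLATION[guest_os][call](*args)

# Two different guests, same hardware effect.
print(hypercall("windows", "WriteFile", 7, "payroll"))
print(hypercall("linux", "write", 7, "payroll"))
```

The point of the sketch is that the applications above the hypervisor never see the raw hardware; they only see the interface of the OS they were built for.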
The car analogy: Imagine that you get into a car. Once inside, you have a button which, when pressed, converts the car into a roaring Ferrari or Lamborghini, or a smooth Mercedes or BMW. The dashboard, the seats, the engine, all magically transform into the car of your dreams. This is exactly what virtualization tries to achieve.
But virtualization went further than just enabling applications
created for different OSes to run on a single server loaded with the
hypervisor. Virtualization also enabled the consolidation of server
farms.
Virtualization brings together the different elements of an
enterprise, namely the servers, each with its memory and processors,
the different storage options (direct attached storage (DAS), Fibre
Channel storage area networks (FC SAN), network attached storage
(NAS)) and the networking elements. Virtualization consolidates the
compute, storage and networking elements and provides the illusion
that appropriate compute, storage and network resources are
available to applications on demand. The applications are provided
with virtual machines carrying the computing, storage and network
units they require.
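The on-demand "illusion" above can be sketched as a resource pool that carves virtual machines out of consolidated hardware. This is a minimal illustration under invented numbers; real platforms add overcommit, placement policies and much more.

```python
# Minimal sketch of consolidated resources handed out on demand
# (illustrative only; all names and capacities are hypothetical).

class ResourcePool:
    """A pool of consolidated compute and storage, carved into VMs."""

    def __init__(self, cpus, storage_gb):
        self.cpus = cpus
        self.storage_gb = storage_gb
        self.vms = {}                      # name -> (cpus, storage_gb)

    def create_vm(self, name, cpus, storage_gb):
        # Grant resources only if the pool can still satisfy the request.
        if cpus > self.cpus or storage_gb > self.storage_gb:
            raise RuntimeError("pool exhausted")
        self.cpus -= cpus
        self.storage_gb -= storage_gb
        self.vms[name] = (cpus, storage_gb)

    def destroy_vm(self, name):
        # Return the VM's resources to the shared pool.
        cpus, storage_gb = self.vms.pop(name)
        self.cpus += cpus
        self.storage_gb += storage_gb

pool = ResourcePool(cpus=16, storage_gb=500)
pool.create_vm("payroll", cpus=4, storage_gb=100)
pool.create_vm("accounting", cpus=2, storage_gb=50)
print(pool.cpus, pool.storage_gb)   # 10 350 left in the shared pool
```

Each application sees only "its" machine, while the pool quietly tracks what is actually left, much like the train analogy further below.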
Virtualization also takes care of providing high availability (HA),
mobility and security to the applications, besides enabling the
illusion of shared resources. If any of the servers on which an
application is executing goes down for any reason, the application
is migrated seamlessly to another server.
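The failover behaviour just described can be sketched as a simple re-placement step: when a server fails, its applications are moved onto the surviving servers. The naive round-robin evacuation policy and every name here are hypothetical; real platforms use far more sophisticated placement.

```python
# Toy failover sketch (illustrative only): evacuate the applications of
# a failed server onto the remaining servers.

def migrate_on_failure(placement, failed_server, servers):
    """placement: dict of app -> server. Returns a new placement with
    every app moved off failed_server."""
    survivors = [s for s in servers if s != failed_server]
    new_placement = dict(placement)
    for i, (app, server) in enumerate(placement.items()):
        if server == failed_server:
            # Naive policy: spread evacuated apps across survivors.
            new_placement[app] = survivors[i % len(survivors)]
    return new_placement

placement = {"payroll": "srv1", "hr": "srv2", "accounting": "srv1"}
print(migrate_on_failure(placement, "srv1", ["srv1", "srv2", "srv3"]))
```

From the application's point of view nothing happened; it simply keeps running on a healthy server.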
The train analogy: Assume there is a train with ‘n’ wagons. Commuters
can get on and off at any station. When they get on the train they
are automatically allocated a seat, a berth and so on. The train
keeps track of how occupied it is and provides the appropriate
seating dynamically. If the wheels of any wagon get stuck, its
passengers are lifted and shifted, seamlessly, to another wagon while
the stuck wagon is safely de-linked from the train.
Virtualization has many applications. It is the dominant technology
used in the creation of public, private and hybrid clouds, thus
providing an on-demand, scalable computing environment.
Virtualization is also used in the consolidation of server farms,
enabling optimum usage of the servers.
From my blog: http://gigadom.wordpress.com/
This does not represent IBM's views or strategies