Virtualization

A completely satisfactory definition of the term “virtualization”, as used in the world of information technology (IT), is difficult to obtain because the concept takes so many different forms and has an extremely wide range of applicability. However, in perhaps its broadest sense, it may be said that virtualization refers to the reorganization and/or bundling of IT resources for the purpose of creating or enabling new capabilities. To further clarify, consider the following.

The folders and subfolders used to organize the computer files on a hard drive constitute a simple example of virtualization. Most computer users perceive these folders to be objects that contain files. However, a folder is itself merely a file, one whose contents happen to describe other files, and no such container objects actually exist on the disk. It may therefore be said that folders represent a virtualization of file container objects. It is worth noting that although folders are a simple example of the virtualization concept, it would be difficult to exaggerate the usefulness of this little device.
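To make the point concrete, the toy sketch below (written in Python, and not a depiction of any real filesystem’s on-disk format) models a folder as an ordinary file whose bytes merely list the names and locations of other files; the helpers pack_folder and read_folder are invented for this illustration.

    import json

    # Toy model: a "folder" is just a file whose contents list
    # (name, location) entries for other files. No container object
    # exists anywhere; software creates the illusion of containment
    # by interpreting these bytes as a directory listing.

    def pack_folder(entries):
        """Serialize a directory listing into ordinary file bytes."""
        return json.dumps(entries).encode("utf-8")

    def read_folder(folder_bytes):
        """Interpret ordinary file bytes as a directory listing."""
        return json.loads(folder_bytes.decode("utf-8"))

    # The "folder" below is stored exactly like any other file would be.
    folder_file = pack_folder([
        {"name": "report.txt", "block": 1042},
        {"name": "photo.jpg", "block": 2217},
    ])

    for entry in read_folder(folder_file):
        print(entry["name"], "-> block", entry["block"])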

Data overflow is a problem that is commonly resolved using virtualization. Sometimes an application will generate more data than a single physical server can hold. Furthermore, the generating application may require the data to be stored contiguously, with no convenient means to distribute it across multiple servers. This problem might be resolved by modifying the application that generates the data, but doing so would only solve the overflow problem for that particular application, and there would doubtless be many other applications with the same problem.

A more effective solution would be a construct that manages several physical storage devices in such a way as to give the illusion of a single storage device of great capacity, and that presents itself to applications as such. This solution exists, and it is known as virtualized storage.
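As a minimal sketch of the idea, and not a real storage product, the following Python class concatenates several fixed-size backing devices (modeled here as plain byte arrays) into one contiguous address space; the name VirtualVolume and its methods are invented for this illustration.

    class VirtualVolume:
        """Presents several fixed-size backing devices as one contiguous store."""

        def __init__(self, devices):
            self.devices = devices                       # e.g. bytearrays
            self.capacity = sum(len(d) for d in devices)

        def _locate(self, offset):
            """Map a global offset to (device, local offset)."""
            for device in self.devices:
                if offset < len(device):
                    return device, offset
                offset -= len(device)
            raise IndexError("offset beyond virtual capacity")

        def write(self, offset, data):
            """Write bytes at a global offset, spanning devices if needed."""
            for i, byte in enumerate(data):
                device, local = self._locate(offset + i)
                device[local] = byte

        def read(self, offset, length):
            """Read bytes from a global offset, spanning devices if needed."""
            out = bytearray()
            for i in range(length):
                device, local = self._locate(offset + i)
                out.append(device[local])
            return bytes(out)

    # Three small "devices" appear to the caller as one 3 KB volume.
    volume = VirtualVolume([bytearray(1024), bytearray(1024), bytearray(1024)])
    volume.write(1020, b"spans two devices")    # crosses a device boundary
    print(volume.read(1020, 17))                # b'spans two devices'

Real virtualized storage adds far more (redundancy, caching, thin provisioning, and so on), but the essential illusion of one large contiguous device is the same.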

Perhaps the most famous example of virtualization, resulting in the creation of a new and strikingly powerful virtual capability, arose in the fight against server sprawl. A relatively recent problem, server sprawl refers to the unchecked proliferation of physical servers. This so-called sprawl can reach a point where a company finds itself lacking sufficient space to house all of its servers.

One of the main factors driving server sprawl was the gross underutilization of server processing capacity. This had become something of an unfortunate industry-wide norm, though not without reason. Historically, a physical server could not run incompatible software side by side, so servers had to be isolated and dedicated to specific tasks. Even when, as was often the case, a task used only a small percentage of the server’s processing power, the entire server had to be conscripted.
 
Of course, IT personnel had long been aware that their servers were underutilized. The situation was maddening, and the search continued for some means by which incompatible software could be made to run on the same physical server so that processing capacity could be more fully utilized. A major stumbling block in this endeavor was the need to trap and redirect privileged instructions. Many engineers believed this to be impossible, because such instructions seemed to operate too close to the bare machine to be intercepted in software.
 
In 1998, VMware, a pioneer in the field of virtualization and the current industry leader, unveiled a very creative solution involving the dynamic rewriting of kernel-level and real-mode code, a technique the company referred to as binary translation. Using binary translation to trap and redirect sensitive instructions, VMware proceeded to virtualize a physical server, creating a virtual machine that ran on a physical host but was fully abstracted from it.
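The toy sketch below illustrates only the general idea of rewriting code so that sensitive instructions are redirected to a software handler instead of reaching the hardware; it is not VMware’s actual technique and uses no real instruction set. The mini instruction names, the SENSITIVE set, and the hypervisor_handler function are all invented for this illustration.

    # Guest "code" is a list of instructions in an invented mini language.
    # Any sensitive instruction is rewritten into a trap to a hypervisor
    # handler before the block is allowed to run.

    SENSITIVE = {"CLI", "STI", "OUT"}    # invented stand-ins for privileged ops

    def hypervisor_handler(instr, vm_state):
        """Emulate the sensitive instruction against the VM's private state."""
        if instr == "CLI":
            vm_state["interrupts"] = False
        elif instr == "STI":
            vm_state["interrupts"] = True
        elif instr.startswith("OUT"):
            vm_state["io_log"].append(instr)

    def translate(block):
        """Rewrite sensitive instructions into ('TRAP', instr) pseudo-ops."""
        return [("TRAP", i) if i.split()[0] in SENSITIVE else ("RUN", i)
                for i in block]

    def execute(block, vm_state):
        for kind, instr in translate(block):
            if kind == "TRAP":
                hypervisor_handler(instr, vm_state)   # redirected, never hits hardware
            else:
                pass                                  # safe instruction runs directly

    vm_state = {"interrupts": True, "io_log": []}
    execute(["MOV A, 1", "CLI", "OUT 0x3F8", "STI"], vm_state)
    print(vm_state)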
 
The VMware virtual machine provided a completely disjoint runtime application environment, functionally mirroring a dedicated physical server. Because the virtual machine was in fact a software construct, however, multiple virtual machines could run on a single physical server, each running software that would previously have been thought entirely incompatible with its neighbors. Tasks once performed by dedicated physical servers could therefore be redistributed and “packed” onto servers that no longer had to be dedicated, and a dramatic improvement in processor utilization was realized immediately.
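The consolidation effect can be pictured with a rough first-fit packing sketch in Python; the utilization figures and the 80 percent ceiling below are invented purely for illustration.

    # Workloads that each occupied a dedicated physical server are packed
    # as virtual machines onto as few hosts as fit under a utilization ceiling.

    workloads = [0.10, 0.15, 0.05, 0.20, 0.10, 0.25, 0.05, 0.10]   # CPU fractions
    CEILING = 0.80                                                  # per-host budget

    hosts = []                      # each entry is the summed load on one host
    for load in workloads:
        for i, used in enumerate(hosts):
            if used + load <= CEILING:
                hosts[i] += load
                break
        else:
            hosts.append(load)      # open a new host only when nothing fits

    print(f"dedicated servers: {len(workloads)}, consolidated hosts: {len(hosts)}")
    print("per-host utilization:", [round(u, 2) for u in hosts])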
 
Perhaps virtual machines did not entirely solve the problem of server sprawl, any more than a broom entirely solves the problem of dirt. However, the new capability was certainly a very big help.
 
It has become something of a cliché for the corporate executive to point a finger at the IT department and say, “There’s a problem? Well, virtualize a solution!” However, virtualization is not a magic bullet. In fact, it is a tool that can be all too easily misused, not infrequently resulting in awkward constructs of the “square wheel” variety. When this occurs, it is usually the tool that unjustly receives the blame rather than the hand wielding it.
 
Nevertheless, it is now undeniable that virtualization represents an effective new approach to problem solving, and innovative virtualization ideas continue to excite the IT community. It is generally agreed that the virtualization approach holds great potential and offers new scope for the software engineer.