While virtualization is not without its drawbacks, its allure is great. After all, it promises to make compatibility and resource management woes -- two eternal computer problems -- things of the past.
Imagine being freed from software compatibility worries and being able to put your idle hardware to productive use. That's the premise behind virtualization. While many users may not yet have heard of virtualization or understand its benefits, the movement is growing, and it may eventually become as ubiquitous a concept as the desktop operating system.
Virtualization is divided into two key categories -- application virtualization and platform virtualization. The goal of application virtualization is to provide a compatibility layer that lets software written for one operating system run on another. This could allow you to, say, run an old Windows 3.1 program on a Linux operating system. The open source project Wine is an example of this kind of virtualization.
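To make that concrete, here is a minimal sketch of what launching a legacy Windows program through a compatibility layer might look like, written in Python purely for illustration. It assumes Wine is installed on a Linux machine; the program path is hypothetical.

```python
import subprocess
from pathlib import Path

# Hypothetical path to a legacy Windows program on a Linux machine.
LEGACY_APP = Path.home() / "legacy-apps" / "oldapp.exe"

def run_under_wine(exe_path: Path) -> int:
    """Launch a Windows executable through Wine's compatibility layer."""
    # Wine translates the program's Windows API calls into Linux
    # equivalents, so no full Windows installation is needed.
    result = subprocess.run(["wine", str(exe_path)])
    return result.returncode

if __name__ == "__main__":
    status = run_under_wine(LEGACY_APP)
    print(f"Program exited with status {status}")
```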
The second category is platform virtualization -- bringing the whole OS onboard. In this scheme, a hypervisor (a software layer that sits between the hardware and the operating systems) allows multiple complete operating systems to run at once. The second OS can run in parallel with the first or as a guest program within the first, as with Windows 7's XP Mode.
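For the curious, here is a sketch of how an administrator might boot a guest OS programmatically, using the Python bindings for the open source libvirt toolkit. The connection URI and the guest name "legacy-xp" are assumptions for illustration; any predefined guest would work.

```python
import libvirt  # open source virtualization API; requires libvirt-python

# Connect to the local hypervisor (QEMU/KVM here; the URI is an assumption).
conn = libvirt.open("qemu:///system")
try:
    # Look up a predefined guest OS; the name "legacy-xp" is hypothetical.
    dom = conn.lookupByName("legacy-xp")
    if not dom.isActive():
        dom.create()  # boot the guest alongside the host OS
    print(f"Guest '{dom.name()}' running: {bool(dom.isActive())}")
finally:
    conn.close()
```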
Some platform virtualization implementations, such as XP Mode, redirect input and display to accomplish application virtualization via a full guest OS rather than a compatibility layer. The experience is more seamless, as the guest's application windows are displayed inline alongside the host's, without the need to switch to a separate virtual machine window and navigate its desktop.
A final important aspect of virtualization is the cloud computing movement. Cloud computing can involve sharing virtualized resources over a network. Shared resources are nothing new, but virtualizing them lowers overhead and ensures that resources are used efficiently. Google, IBM, and Microsoft are just a few of the major names delving into this field. An example of cloud computing that even novice users can appreciate is Microsoft Office 2010, which will be available as free web applications for home users.
Virtualization is a fast-evolving field, and it is not without its problems and obstacles. One obstacle is cost. While the costs of virtualization are dropping (a Windows XP virtual machine, for example, comes with Windows 7 Professional at no extra cost), deploying such a solution at a business may require some up-front investment in IT hardware and staff hours. In the long run it can save money, but for cash-strapped businesses the initial outlay can be a deterrent to adoption, albeit one that will eventually be overcome.
A more serious challenge is the increased likelihood of service interruptions. By putting more services on a single set of hardware, you increase the damage a power failure could cause -- whether an outage at your location or a failed power supply in the machine itself. Furthermore, by running your hardware under larger workloads, you risk a higher incidence of part failures. Again, this is unproven territory, as the field is still young, but logic dictates that hardware flexed to its full capacity will likely experience more failures than hardware that spends a significant amount of time sitting idle. As the deployment of virtual machines is in the hands of system administrators, it's up to them to make sure they're not overburdening the hardware.
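As a rough illustration of that kind of housekeeping, the sketch below (Python again, using the libvirt bindings) counts running guests and checks the host's load average before approving another. The threshold is an arbitrary placeholder, not a recommendation, and a real capacity policy would weigh memory, storage, and I/O as well.

```python
import os
import libvirt

MAX_LOAD_PER_CPU = 0.75  # arbitrary placeholder threshold, tune per host

def host_has_headroom() -> bool:
    """Crude capacity check: compare the 1-minute load average to CPU count.

    os.getloadavg() is available on Unix-like hosts only.
    """
    one_minute_load, _, _ = os.getloadavg()
    return one_minute_load / os.cpu_count() < MAX_LOAD_PER_CPU

conn = libvirt.open("qemu:///system")
try:
    running = [d.name() for d in conn.listAllDomains() if d.isActive()]
    print(f"{len(running)} guests running; "
          f"room for another: {host_has_headroom()}")
finally:
    conn.close()
```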
A final challenge is security. By placing multiple virtual machines on a single set of hardware, you run the risk of an escalated intrusion if an attacker compromises the hypervisor layer -- the layer beneath the virtual machines. Also, in a guest/host setup (like Windows XP Mode), you run the risk of guest-to-host attacks, which can exploit vulnerabilities in legacy operating systems or applications. Thankfully, such attacks have thus far been rare, but proof-of-concept exploits show the need for vigilance. As always, diligent patching and careful system administration can protect a virtualized network against most threats.