Virtualization is a technique that allows multiple “virtual machines” to reside and run concurrently on a single hardware platform.
A virtual machine behaves like a server, but it is implemented in software rather than additional hardware. In essence, virtualization decouples the operating system from the physical hardware, providing better IT resource utilization, greater application flexibility and hardware independence.
Several virtual machines, each with its own operating system, can run in isolation side by side on the same physical hardware, and each “virtual machine” has in essence its own set of virtual hardware. The guest operating system therefore sees a controlled, consistent set of devices regardless of the actual physical components.
In addition, virtualization controls the CPU, memory and storage allotted to each “virtual machine” and allows an operating system to migrate from one machine to another. These “virtual machines” are encapsulated in files; those files can be quickly saved, copied and migrated to another physical host, enabling zero-downtime maintenance and controlled workload consolidation.
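Because a virtual machine is ultimately just a set of files (a disk image plus a configuration file), backing one up or moving it can be as simple as copying a directory. Here is a minimal Python sketch; the VM name `web01` and the file contents are hypothetical stand-ins, not real images:

```python
import shutil
from pathlib import Path

def backup_vm(vm_dir: str, backup_root: str) -> Path:
    """Copy a powered-off VM's directory -- disk image plus
    config file -- to a backup location. The whole VM travels
    as plain files on disk."""
    src = Path(vm_dir)
    dest = Path(backup_root) / src.name
    shutil.copytree(src, dest)  # one copy moves the entire machine
    return dest

# Hypothetical VM "web01": a stand-in disk image and config file
vm = Path("vms/web01")
vm.mkdir(parents=True, exist_ok=True)
(vm / "web01.vmdk").write_bytes(b"\x00" * 1024)   # stand-in disk image
(vm / "web01.vmx").write_text("memsize = 512\n")  # stand-in config

dest_dir = backup_vm("vms/web01", "backups")
print(sorted(p.name for p in dest_dir.iterdir()))
```

The same property is what makes migration to another host straightforward: transfer the files, register the machine, and power it on.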
Saving Time and Cutting Down Costs Are Just the Beginning
Over the past several years, data centers have scaled horizontally and decentralized because centralized servers were viewed as too expensive to acquire and maintain. As a result, applications were moved from large shared servers onto their own individual machines.
Although decentralization simplified the maintenance of each application and improved security by isolating one system from another on the network, it also increased power consumption, floor-space requirements and management effort.
According to xensource.com, “… these areas have been known to account for up to $10,000 in annual maintenance cost per machine and decrease the efficiency of each machine by 85% due to idle time.”
The long and the short of it is: virtualization is a middle ground between centralized and decentralized environments. You no longer need to purchase separate hardware for each application. If each application is given its own operating environment on a single piece of hardware, you reap the benefits of security and stability while making full use of the hardware’s resources.
Also, virtual machines are isolated from the host and from one another; if one virtual machine crashes, the other environments remain unaffected. Data does not leak across virtual machines, and applications can communicate only over a configured network connection. Each virtual machine is saved as a single entity or set of files, which makes backups, copies and moves easy.
Why Use Virtualization? Let Me Count The Ways …
Since virtualization decouples the operating system from the hardware, there are several compelling reasons to consider using it:
- Data center consolidation and decreased power consumption
- Simplified disaster recovery solutions
- The ability to run Windows, Solaris, Linux and NetWare operating systems and applications concurrently on the same server
- Increased CPU utilization from 5-15% to 60-80%
- The ability to move a “virtual machine” from one physical server to another without reconfiguring, which is useful when migrating to new hardware because the existing hardware is out-of-date or has failed
- Better security, because each “virtual machine” is isolated from the others on the network; if one “virtual machine” crashes, it does not affect the other environments
- The ability to capture (take a snapshot of) the entire state of a “virtual machine” and roll back to that configuration, which is ideal for testing and training environments
- The ability to obtain centralized management of IT infrastructure
- A “virtual machine” can run on any x86 server
- It can access all physical host hardware
- The ability to re-host legacy operating systems, such as Windows NT Server 4.0 and Windows 2000, on new hardware and operating systems
- The ability to designate multiple “virtual machines” as a team where administrators can power on and off, suspend or resume as a single object
- The ability to simulate hardware; for example, it can mount an ISO file as a CD-ROM and .vmdk files as hard disks
- The ability to configure network adapter drivers to use NAT through the host machine rather than bridging, which would require an IP address for each machine on the network
- The ability to test live CDs without first burning them to disc or rebooting the computer
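Most of the items above are hypervisor features rather than things you script by hand, but the snapshot-and-rollback idea is easy to illustrate. The following is a toy Python sketch, not any real hypervisor’s API: it models the “entire state” of a machine as a dictionary that can be captured and restored.

```python
import copy

class ToyVM:
    """Toy model of snapshot/rollback: capture the entire machine
    state under a name, then restore it later on demand."""

    def __init__(self):
        self.state = {"disk": {}, "powered_on": False}
        self._snapshots = {}

    def snapshot(self, name):
        # Deep-copy so later changes don't alter the saved snapshot
        self._snapshots[name] = copy.deepcopy(self.state)

    def rollback(self, name):
        self.state = copy.deepcopy(self._snapshots[name])

vm = ToyVM()
vm.state["disk"]["config.txt"] = "clean install"
vm.snapshot("baseline")                 # capture the known-good state

vm.state["disk"]["config.txt"] = "broken by a test run"
vm.rollback("baseline")                 # discard the damage
print(vm.state["disk"]["config.txt"])   # -> clean install
```

This is exactly the workflow that makes snapshots so valuable in testing and training: run a destructive experiment, then roll back to the baseline in seconds.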
The time for virtualization has come and the possibilities are endless.
Virtualization offers server consolidation and containment, minimized downtime, easier recovery, and elegant solutions to many security problems. In testing and development environments, each developer can have a personal virtual machine, isolated from other developers’ code and from production.
For modern data centers, virtualization is definitely the way to go.