Virtualization – Containing IT Costs

Whenever a company wants to run a new application or operating system, it typically buys a new server to go with it, even though servers usually run at only 20-30% of their capacity.

Companies dedicate a server to each application because there are times when an application needs all of a server's resources; with that possibility in mind, they avoid running multiple operating systems or applications on one machine.

This approach to maintaining servers is widely recognized as inefficient: it demands huge data centers and consumes a great deal of power.

The answer to this problem appears to be "virtualization," a technology now seeing rapid growth worldwide. In fact, the technology is not new at all: IBM has used it on mainframes for a long time, which is why mainframe capacity utilization runs close to 90%, compared with just 20 to 30% on other servers.

It is all very well to talk about “virtualization,” but what exactly is it?

Virtualization is based on a software program known as a "hypervisor." The hypervisor carves a single physical server into many virtual servers, allowing multiple operating systems to share one processor.
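As a concrete illustration, the sketch below uses libvirt, one widely used open-source API for managing hypervisors such as KVM; the connection URI here is an assumption you would adapt to your own environment. It simply lists the virtual servers sharing one physical host:

```python
import libvirt  # pip install libvirt-python

# Connect to the local hypervisor. "qemu:///system" assumes a
# KVM/QEMU host managed by libvirt; adjust the URI for your setup.
conn = libvirt.open("qemu:///system")

# Each libvirt "domain" is one virtual server carved out of this
# physical machine by the hypervisor.
for dom in conn.listAllDomains():
    state, _reason = dom.state()
    status = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "stopped"
    print(f"{dom.name()}: {status}")

conn.close()
```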

According to the Director of Hewlett-Packard Asia Pacific, “Each OS sees each virtual machine as a separate physical server.”

Although each operating system appears to have its own processor, network, memory, and storage, it is actually the hypervisor that controls the real hardware, allocating each operating system the resources it requires.

If certain virtual machines need more computing power or memory, the hypervisor can provide those resources dynamically, diverting them from other virtual machines that do not need them at that moment.
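To make the idea concrete, here is a hypothetical toy model, not any vendor's real scheduler: each guest is nominally entitled to an equal slice of the CPU, but the hypervisor lends idle capacity to whichever guest needs more at the moment.

```python
def allocate(demands: dict[str, float], capacity: float = 1.0) -> dict[str, float]:
    """Grant each VM min(demand, fair share), then lend out the leftovers."""
    fair = capacity / len(demands)
    grants = {vm: min(d, fair) for vm, d in demands.items()}
    spare = capacity - sum(grants.values())
    # Hand the unused capacity to guests whose demand exceeds their slice.
    for vm, d in demands.items():
        if d > grants[vm] and spare > 0:
            extra = min(d - grants[vm], spare)
            grants[vm] += extra
            spare -= extra
    return grants

# Two nearly idle web servers and one busy database:
print(allocate({"web1": 0.05, "web2": 0.05, "db": 0.85}))
# {'web1': 0.05, 'web2': 0.05, 'db': 0.85} -- the database borrows the
# headroom the web guests are not using, instead of being capped at a
# static one-third of the machine.
```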

Because virtualization is implemented in software (the hypervisor), it can do things like move a workload from a hot server to a cooler one to save on cooling costs. All of this happens without end users noticing any change whatsoever.
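Hypervisors expose this capability as "live migration." As a hedged sketch, libvirt's migration call looks roughly like the following; the host and guest names are invented for illustration, and the VIR_MIGRATE_LIVE flag asks libvirt to copy the guest's memory while it keeps running, which is why users see no interruption.

```python
import libvirt

# Connect to the busy (hot) source host and the cooler destination.
src = libvirt.open("qemu+ssh://hot-host/system")
dst = libvirt.open("qemu+ssh://cool-host/system")

# "appvm" is a made-up guest name for this example.
dom = src.lookupByName("appvm")

# Live-migrate the running guest to the cooler host.
# Arguments: destination connection, flags, new name, URI, bandwidth.
dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

src.close()
dst.close()
```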

Analysts say the technology will help companies cut power and cooling costs significantly while providing more flexibility and higher utilization.

IT customers are also drawn to virtualization for testing, since it lets them run multiple operating systems, such as Windows and Linux, on the same machine at the same time.

Recent studies show that more than 75% of companies worldwide have adopted virtualization or plan to do so within the next year, and IT spending on virtualization is estimated to reach $15 billion by 2009, a clear sign of the impact the technology has had on users.

The technology has been adopted not only by large companies but also by small businesses and home-office users.

Hypervisor vendors such as Microsoft are now moving into "virtualization 2.0," going beyond single-server virtualization to advanced systems offering far greater scalability, disaster recovery, and high availability.

Analysts are already looking ahead to automation of the whole virtualization process, such as moving workloads from one virtual server to another automatically.

Virtualization is a powerful concept that will help customers handle growth and scale up their operations without having to plan server capacity and computing power far in advance.

Analysts predict that the day is not far off when virtualized third-party data centers will offer companies storage and computing power on demand.
