Cost reduction is always mentioned as one of the benefits of virtualization, but there are few case studies with actual numbers to back it up. That’s why this post on the Server Virtualization Blog by Eric Siebert was a refreshing read. Eric writes:

In today’s world the cost of just about everything has been on the rise. Fuel costs in particular have a ripple effect on just about everything we buy, computers included. That’s why virtualization is a great way to offset those increased costs. Providing power and cooling to a data center can be a very big expense, and virtualizing servers can dramatically reduce it. PlateSpin provides a nice power savings calculator on their website. If we plug in the following numbers:

- 200 physical servers
- average usage of 750 watts per server
- average processor utilization of 10% before virtualization
- target processor utilization of 60% after virtualization

The average power and cooling savings per year come out to $219,000, with a consolidation ratio of 5:1 and a cost of 10 cents per kilowatt-hour. As the cost of power increases the savings become even greater: at 12 cents the savings become $262,800 per year, and at 15 cents they become $328,500 per year.
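The calculator’s internals aren’t shown in the post, but the quoted figures scale linearly with the electricity rate, which is easy to verify. Here is a minimal Python sketch; the function name `annual_savings` and the back-of-the-envelope cooling cross-check at the end are our own assumptions, not PlateSpin’s formulas:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_savings(baseline_savings, baseline_rate, new_rate):
    """Savings scale linearly with the cost per kilowatt-hour."""
    return baseline_savings * (new_rate / baseline_rate)

# Figures quoted above: $219,000/year at $0.10 per kWh.
print(round(annual_savings(219_000, 0.10, 0.12)))  # 262800, matching the post
print(round(annual_savings(219_000, 0.10, 0.15)))  # 328500, matching the post

# Implied energy saved: $219,000 / $0.10 = 2,190,000 kWh per year.
kwh_saved = 219_000 / 0.10

# Rough cross-check (our assumption, not from the calculator): a 5:1
# consolidation retires 160 of the 200 servers at 750 W each, and cooling
# roughly doubles the raw server draw.
raw_kwh = 160 * 0.750 * HOURS_PER_YEAR  # 1,051,200 kWh from retired servers
with_cooling = 2 * raw_kwh              # ~2,102,400 kWh, close to the implied 2,190,000
```

The linear scaling is the useful takeaway: once you know your savings at one rate, savings at any other rate follow directly from the ratio of the rates.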

Of course savings will vary based on a number of factors, including how well utilized your physical servers are before virtualization, your consolidation ratio (which can sometimes be as high as 15:1), and your location. Different parts of the country average different costs per kilowatt-hour: California and New York tend to be the highest at 12–15 cents, while Idaho and Wyoming are the cheapest at about 5 cents. Power costs tend to rise a lot more often than they go down, so the cost argument for virtualization becomes much easier to make when you factor in the potential savings.