At last week's Microsoft Worldwide Partner Conference in Houston, the company's CEO Steve Ballmer made this remark during his opening keynote speech:
"We have something over a million servers in our data center infrastructure. Google is bigger than we are. Amazon is a little bit smaller. You get Yahoo! and Facebook, and then everybody else is 100,000 units probably or less."
That's a lot of processors and software running nearly 24/7, but what are the costs to build and operate that much hardware? This week, former Microsoft team member and current Amazon vice president James Hamilton tried to dissect just how much power it takes to run all those servers and how much money Microsoft needed to build and house them in the first place.
In a post on his personal blog, Hamilton estimated that each server draws about 150 to 300W. Applying the higher figure to one million servers, the total power draw comes to 300MW, or three hundred million watts. If those numbers are correct, then Microsoft uses 2,629,743 MWh, or roughly 2.6 terawatt-hours, of power per year; that's enough to power 230,000 homes in the US.
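Hamilton's back-of-the-envelope math is easy to check. A quick sketch, assuming the article's figures (one million servers at the upper 300W estimate, and a year of continuous operation):

```python
# Back-of-envelope check of the power and energy estimates in the article.
servers = 1_000_000            # Ballmer's "something over a million servers"
watts_per_server = 300         # upper end of Hamilton's 150-300W range

total_mw = servers * watts_per_server / 1e6   # total draw in megawatts
hours_per_year = 365.25 * 24                  # ~8,766 hours per year
mwh_per_year = total_mw * hours_per_year      # annual energy in MWh

print(total_mw)        # 300.0 MW
print(mwh_per_year)    # ~2,629,800 MWh, i.e. roughly 2.6 TWh
```

The small gap between this result and the article's 2,629,743 MWh figure comes down to how many hours one counts in a year; either way it rounds to about 2.6 TWh.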
Assuming an average cost of $2,000 per server, Microsoft has spent about $2 billion on the server hardware alone, according to Hamilton. Add the cost of constructing the data center buildings themselves, plus networking, and the total comes to $4.25 billion, at least by his figures.
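The capital-cost tally can be sketched the same way; the split between buildings and networking is not broken out in the article, so the sketch only shows what's left after the server hardware is subtracted from Hamilton's all-in figure:

```python
# Rough capital-cost tally based on Hamilton's estimates.
servers = 1_000_000
cost_per_server = 2_000                    # assumed average server price

server_capex = servers * cost_per_server   # hardware alone
total_capex = 4.25e9                       # Hamilton's all-in estimate
facility_and_network = total_capex - server_capex

print(server_capex)           # 2,000,000,000 -> the $2 billion figure
print(facility_and_network)   # 2,250,000,000 for buildings plus networking
```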
While that's a lot of money, it's also possible that Microsoft saved some cash by buying servers in bulk. As for the power estimates, the hardware and software in the servers could likewise be optimized to use less energy. In any event, it's unlikely that Microsoft will ever break down these numbers officially.