One of the key motivations for pursuing energy efficiency initiatives in existing data centers is avoiding the capital expense of building new facilities, a benefit that proponents of the many efficiency measures and technologies often have a difficult time quantifying.
So, just how much does new data center capacity cost? Fortuitously, capital costs came up in several presentations at the Uptime Symposium in New York last week, and I gut-checked the numbers with a highly regarded consultant who advises clients on where to build "utility-scale" data centers.
I had always heard the rule of thumb that a new data center costs $10 million per megawatt to build, with some variation by tier (the reliability classification administered by the Uptime Institute).
My fellow consultant said that figure is pretty much the low end of the range, with some centers coming in as high as $20 million per MW.
I don't have clarity on whether that figure is based on total facility load or refers solely to IT equipment capacity, though I suspect it's the latter.
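To make the financial stakes of that ambiguity concrete, here is a minimal back-of-the-envelope sketch in Python. The $10 million to $20 million per MW range comes from the discussion above; the 5 MW IT load and the PUE of 1.6 are purely hypothetical values I've chosen to show how much the estimate shifts depending on which basis the rule of thumb uses.

```python
# Back-of-the-envelope capital cost estimate for new data center capacity.
# The $10M-$20M per MW range is the rule of thumb quoted above; the PUE
# value is a hypothetical assumption used only to illustrate how the cost
# basis (IT load vs. total facility load) changes the answer.

COST_PER_MW_LOW = 10_000_000   # USD per MW, low end of the quoted range
COST_PER_MW_HIGH = 20_000_000  # USD per MW, high end of the quoted range
ASSUMED_PUE = 1.6              # hypothetical power usage effectiveness

def capex_range(it_load_mw: float, basis: str = "it") -> tuple[float, float]:
    """Estimate the capital cost range for a given IT load in MW.

    basis="it"    -> apply the $/MW figure to IT equipment capacity only
    basis="total" -> apply it to total facility load (IT load * PUE)
    """
    mw = it_load_mw if basis == "it" else it_load_mw * ASSUMED_PUE
    return mw * COST_PER_MW_LOW, mw * COST_PER_MW_HIGH

if __name__ == "__main__":
    for basis in ("it", "total"):
        low, high = capex_range(5.0, basis)  # hypothetical 5 MW IT load
        print(f"5 MW IT load, {basis} basis: ${low/1e6:.0f}M-${high/1e6:.0f}M")
```

Under these assumptions, the same 5 MW of IT capacity pencils out at $50M to $100M if the rule of thumb is per IT megawatt, but $80M to $160M if it is per megawatt of total facility load, which is why the distinction matters.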