I'm a firm believer that focusing on the efficiency of IT equipment, while a meaningful long-term project, shouldn't consume the majority of industry and regulatory attention. The far bigger opportunity lies in the fact that most IT equipment (certainly servers, and some data storage technologies) operates at remarkably low average utilization rates. Driving energy conservation in data centers and IT infrastructure should come before energy efficiency considerations.
Even first-tier, sophisticated server users with only a limited set of IT processes (e.g., internet search companies and e-tailers) manage utilization rates of only around twenty percent, by their own admission, and every other enterprise user would be an industry leader if it pushed above ten percent.
So what is the problem? After all, we're talking about turning equipment off (or "down") when it isn't being used — just like turning off your car when you park it in the driveway.
The industry will point to an IT management ethic that values redundancy and availability above all else: absent some sophisticated new approach, once a server gets plugged in, it stays on for life.