Containing the cost of powering the data center is one of the chief priorities for many IT and facilities professionals. But determining whether a data center is a paragon of green virtue or an unrepentant gas guzzler is trickier than it might seem at first glance.
That's why the Green Grid, a nonprofit trade group led by IT vendors, devised Power Usage Effectiveness (PUE), which compares the total power used in a facility with the power devoted specifically to IT equipment. But as with almost any new standard, there is confusion within the industry about how to measure PUE and in what context it is useful.
PUE measures how many watts are needed to power and cool IT gear, counting only the power dedicated solely to the data center, an important distinction in mixed-use buildings that house both IT equipment and office space.
"A PUE of 2.0 indicates that for every watt of IT power, an additional watt is consumed to cool and distribute power to the IT equipment," according to a Google research paper.
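The ratio described above is simple arithmetic: total facility power divided by IT power. A minimal sketch, using hypothetical wattage figures:

```python
def pue(total_facility_watts: float, it_watts: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_watts / it_watts

# Hypothetical facility: 500 kW total draw, of which 250 kW reaches IT gear.
ratio = pue(500_000, 250_000)
print(ratio)  # 2.0 -- one extra watt of cooling/distribution per IT watt
```

A PUE of 1.0 would mean every watt entering the facility reaches IT equipment; real facilities always exceed that because of cooling, lighting, and power-conversion losses.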
The Uptime Institute measures PUE in a fee-based service for its members and has found average ratings of 2.0 to 2.5. But Kenneth Brill, the institute's founder and executive director, is wary of businesses misinterpreting the measurements.
Say an IT shop virtualizes its servers, reducing its IT power consumption by 20%. Few observers would consider this a bad thing, but because cooling and power-distribution overhead does not fall proportionally, the move might result in a higher PUE, angering the facility director's boss, Brill says.
"Is it the facility manager's fault that the IT load is light?" Brill asks.
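Brill's objection can be seen in a worked example. If the facility overhead stays roughly fixed while the IT load shrinks, the ratio gets worse even though total consumption has dropped. The wattage figures below are hypothetical:

```python
def pue(total_facility_watts: float, it_watts: float) -> float:
    return total_facility_watts / it_watts

overhead = 250_000            # cooling + distribution load, assumed constant (hypothetical)
it_before = 250_000           # IT load before virtualization
it_after = it_before * 0.8    # 20% cut after virtualizing servers

print(pue(overhead + it_before, it_before))  # 2.0
print(pue(overhead + it_after, it_after))    # 2.25 -- PUE worsens despite lower total power
```

Total facility draw falls from 500 kW to 450 kW, yet the PUE rises from 2.0 to 2.25: the metric penalizes a genuine efficiency gain when the denominator shrinks faster than the numerator.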
The Green Grid's PUE is one of numerous attempts to analyze IT power usage and efficiency. The Uptime Institute has proposed alternative measures, including Corporate Average Datacenter Efficiency (CADE), which takes into account not only power usage but also a company's CPU utilization rates and the effectiveness with which the data center's IT equipment transforms energy into "useful" work.
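CADE is commonly described as a product of facility-side and IT-side percentages, combining energy efficiency with utilization on both sides. The sketch below assumes that four-factor formulation; the sub-metric names and example percentages are illustrative, not an official specification:

```python
def cade(facility_energy_eff: float, facility_utilization: float,
         it_energy_eff: float, it_utilization: float) -> float:
    """Corporate Average Datacenter Efficiency, sketched as the product of
    four fractions (0-1): how efficiently the facility delivers power, how
    fully its capacity is used, how efficiently IT gear turns energy into
    work, and average IT (e.g. CPU) utilization. Assumed formulation."""
    return (facility_energy_eff * facility_utilization *
            it_energy_eff * it_utilization)

# Hypothetical shop: 50% facility energy efficiency, 80% facility utilization,
# 60% IT energy efficiency, 20% average CPU utilization.
print(f"{cade(0.5, 0.8, 0.6, 0.2):.1%}")  # 4.8%
```

Low single-digit results are plausible under these assumptions, which is the point of the metric: multiplying utilization into the score exposes waste that a pure power ratio like PUE hides.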