Beyond PUE: The Big Picture Of Data Center Energy Efficiency

Date: 2015-01-13 18:55:13
Author: 10Gtek
To hear the buzz in the data center industry, a facility manager might assume that reducing the PUE, or power usage effectiveness, of a data center would be a foolproof way to add value to the business, as long as the steps taken to improve PUE don’t cause a spike in the risk of downtime. And in many cases, that is true. Data centers consume a lot of energy, and improving the efficiency of the facility infrastructure can reduce the organization’s energy bill significantly. But PUE may not point to the best opportunities to add value to the overall business — or even to the best ways to cut data center energy use.


PUE has become a widely used metric for data center energy efficiency. But while PUE can be useful, it doesn’t tell the whole story of data center energy efficiency. “If you have servers that are running at 2 percent utilization, it doesn’t matter how good your PUE is,” says Jonathan Koomey, research fellow, Steyer-Taylor Center for Energy Policy and Finance, Stanford University. Low server utilization means that a lot of energy is being wasted by IT hardware; raising utilization can be a great way to improve overall computing efficiency. 
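
PUE itself is simply the ratio of total facility energy to the energy drawn by the IT equipment, so the metric is blind to how productively that IT energy is used. The sketch below makes Koomey’s point concrete; it is a minimal illustration, and the energy figures, utilization levels, and abstract “work units” are assumptions chosen for the example, not data from the article.

```python
# Minimal sketch of why PUE alone can miss IT-side waste.
# The PUE formula is standard; the energy figures, utilization levels, and
# "work units" below are illustrative assumptions.

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (1.0 is the theoretical floor)."""
    return total_facility_kwh / it_kwh

def work_per_facility_kwh(it_kwh: float, total_facility_kwh: float, utilization: float) -> float:
    """Crude proxy: useful work delivered per kWh drawn by the whole facility."""
    useful_work = it_kwh * utilization          # only the utilized fraction does useful work
    return useful_work / total_facility_kwh

# Facility A: excellent PUE (1.2) but servers running at 2 percent utilization.
print(pue(1200, 1000), work_per_facility_kwh(1000, 1200, 0.02))   # 1.2  ~0.017

# Facility B: worse PUE (1.6) but servers at 50 percent utilization.
print(pue(1600, 1000), work_per_facility_kwh(1000, 1600, 0.50))   # 1.6  ~0.313
```

In this toy comparison, Facility B has the worse PUE yet delivers roughly eighteen times more useful work per kilowatt-hour the facility draws.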


Utilization can be raised in many ways. One common strategy is virtualization, which involves the consolidation of software applications onto fewer servers and enables other servers to be turned off. The result is a reduction in energy use. Another possibility is simply to increase the utilization of some or all servers; in that case, the result would be an increase in IT energy use but also a gain in computing capacity, which may be important for the business.
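
As a rough illustration of the first path, the sketch below estimates the annual energy saved by consolidating lightly loaded servers onto a handful of virtualization hosts. The server counts, per-server power draws, and consolidation ratio are assumptions invented for the example.

```python
# Back-of-the-envelope estimate for the consolidation path described above.
# Power draws, server counts, and the consolidation ratio are illustrative assumptions.

IDLE_KW_PER_SERVER = 0.20      # assumed draw of a lightly loaded physical server
LOADED_KW_PER_SERVER = 0.35    # assumed draw of a well-utilized virtualization host
HOURS_PER_YEAR = 8760

def annual_kwh(servers: int, kw_per_server: float) -> float:
    return servers * kw_per_server * HOURS_PER_YEAR

# Before: 100 physical servers, each mostly idle.
before = annual_kwh(100, IDLE_KW_PER_SERVER)

# After: the same applications consolidated onto 10 virtualization hosts;
# the other 90 machines are powered off.
after = annual_kwh(10, LOADED_KW_PER_SERVER)

print(f"before: {before:,.0f} kWh/yr, after: {after:,.0f} kWh/yr, saved: {before - after:,.0f} kWh/yr")
```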


Either way, a narrow focus on PUE would almost never lead to efforts to increase server utilization, because utilization does not appear in the metric at all: PUE only compares total facility energy to the energy drawn by IT equipment, not how productively that IT energy is used.


“The focus on improving one metric like PUE distracts from the focus on the whole system,” says Koomey, an expert on data center energy use, efficiency, and management practices who teaches an online class titled “Data Center Essentials for Executives.” “Amory Lovins from the Rocky Mountain Institute says that if you optimize parts of a system, you ‘pessimize’ the whole system.”


The starting point for evaluating data center energy efficiency efforts is to understand that a data center produces a service, which has a business value as well as a cost, says Koomey. “Sometimes things that you might do to reduce costs will also reduce the business value — and reduce business value more than the costs are reduced,” says Koomey.


Consider the idea of increasing data center temperatures to reduce cooling costs. Taking that step could limit an organization’s ability to deploy more or higher density servers. “You might increase the temperature in the data center and improve your PUE, but ultimately not increase your ability to produce more computing — and actually restrict it,” says Koomey. “People need to understand it’s not just about costs. It's also about the benefit side of computing. If you do things that reduce costs on the surface, but prevent you from deploying more computing, that’s not necessarily the best thing for the company.”


Koomey says that business value comes from the amount of computing that a data center can produce. The goal, he says, is to “deliver compute at the lowest cost per compute cycle.” In that context, a facility department should see itself as a “cost-reducing profit center,” Koomey says.


“Let’s not just do things that reduce costs,” he says. “Let’s think about ways that we can expand the amount of compute that we’re doing, so that the total cost per compute cycle goes down and the total revenues per compute cycle go up.”
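
A toy calculation shows how that framing can change a decision. The dollar figures and compute volumes below are invented purely for illustration; the point is only that spending slightly more while expanding output can beat a straight cost cut on a per-unit basis.

```python
# Worked example of the "cost per compute cycle" framing in the quote above.
# All dollar figures and workload counts are invented for illustration.

def cost_per_unit(total_cost: float, compute_units: float) -> float:
    return total_cost / compute_units

# Today: the facility costs $2.0M a year and delivers 1.0M units of compute.
print(cost_per_unit(2_000_000, 1_000_000))      # $2.00 per unit

# Option 1: cut costs 10 percent in ways that also cap capacity at today's level.
print(cost_per_unit(1_800_000, 1_000_000))      # $1.80 per unit

# Option 2: spend 10 percent more, but raise utilization so the same facility
# delivers 60 percent more compute.
print(cost_per_unit(2_200_000, 1_600_000))      # $1.375 per unit -- lower cost per unit
```

In this made-up comparison, option 2 costs more in absolute terms but delivers each unit of compute for less, while adding revenue-generating capacity — the “cost-reducing profit center” posture Koomey describes.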