HP Researcher: Power Efficiency Has a Long Way to Go

The IT and consumer electronics industries have a long way to travel in terms of improving power efficiency of electronic systems and devices, a Hewlett-Packard researcher argues in the latest issue of the Association for Computing Machinery's flagship journal.

"We've made tremendous progress over the past several years," said HP Labs distinguished technologist Parthasarathy Ranganathan, in an interview with IDG. "But when you look at the fundamental limits," the industry could make quantum leaps in efficiency, he said.

How much so? Extending the arguments of famed physicist Richard Feynman into the digital age, Ranganathan estimated that, based on "the physical limits on the power costs to information transfer," the power used by a single handheld could, in theory, power a billion desktop computer processors.

Ranganathan admitted that his estimates are based on a theoretical limit, one that probably will never be reached. But even if we were to increase power efficiency by a factor of 10, the amount of power that could be saved would be remarkable, he contended.

The article, "Recipe for Efficiency: Principles of Power-Aware Computing," offers a list of techniques that Ranganathan and his peers have developed over the years for improving power efficiency, albeit not to the levels of Feynman's prophecy.

Ranganathan is a go-to person at HP when it comes to power management, having worked on power efficiency issues on everything from handheld devices to supercomputers over the past 10 years.

"Fundamentally, the work we have been doing is the same thing [regardless of the platform]. There are a bunch of common techniques that keep coming up over and over," he said.

Ranganathan said he recognized the wide applicability of these ideas when some HP engineers approached him to help them improve the battery life of a digital camera they were designing. By day, Ranganathan is the project manager of HP's Exascale Data Center project, which is seeking to develop next-generation data center technologies. By applying techniques that he had developed for data center work, he said he was able to suggest changes that led to notable gains in the camera's battery life.

The ACM journal article lists a number of ways in which power is wasted, thanks to design and implementation choices.

For instance, multi-use devices, such as smartphones, almost always use more energy than single-use devices, such as MP3 players. "By definition, general-purpose systems must be designed to provide good performance for a multitude of different applications. This requirement results in designers using the 'union' of maximum requirements of all application classes," Ranganathan wrote.
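The "union" Ranganathan describes can be made concrete with a small sketch. The application classes and their numbers below are hypothetical, invented purely to illustrate the idea: a general-purpose device must be provisioned for the worst case of each resource across all the applications it supports, even though no single application needs all of those maxima at once.

```python
# Hypothetical per-application resource requirements for a smartphone.
# A dedicated MP3 player would only need the "music" row.
requirements = {
    "music":  {"cpu_mhz": 300, "ram_mb": 128, "radios": 0},
    "video":  {"cpu_mhz": 900, "ram_mb": 512, "radios": 0},
    "browse": {"cpu_mhz": 600, "ram_mb": 384, "radios": 1},
}

# The general-purpose device is sized for the per-resource maximum
# ("union" of maximum requirements) across all application classes.
union = {resource: max(app[resource] for app in requirements.values())
         for resource in next(iter(requirements.values()))}

print(union)  # → {'cpu_mhz': 900, 'ram_mb': 512, 'radios': 1}
```

The resulting hardware budget exceeds what any one application actually uses, which is one source of the efficiency gap between multi-use and single-use devices.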

Another problem: over-provisioning power. "Many power supplies are optimized for peak conversion efficiency at high loads. When these systems are operated at low loads, the efficiency of conversion can drop dramatically, leading to power inefficiencies," he wrote.
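The cost of that efficiency drop is easy to quantify. The figures below are hypothetical, chosen only to illustrate the shape of the problem: the same DC load drawn through a supply operating at a poor conversion efficiency pulls substantially more power from the wall.

```python
# Hypothetical operating points: a supply might convert at ~94% efficiency
# near full load but fall to ~70% at light load, where fixed conversion
# losses dominate.
def wall_power(dc_load_w: float, efficiency: float) -> float:
    """AC power drawn from the wall to deliver a given DC load."""
    return dc_load_w / efficiency

# The same 100 W of useful IT load at the two operating points:
print(round(wall_power(100, 0.94)))  # → 106 (watts from the wall)
print(round(wall_power(100, 0.70)))  # → 143 (watts from the wall)
```

At the low-load operating point, more than 40 W of the draw is conversion loss rather than useful work, which is why over-provisioned supplies spending most of their time at light load waste so much power.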

Some of the potential solutions that Ranganathan offers are well-known in the field of component design, while others are more obscure, he admitted.

For instance, one technique that could be more widely used, in Ranganathan's view, is "spending energy to save energy." The idea is to introduce new capabilities that lower overall energy usage, even if the capabilities themselves require additional energy to run.

One example would be a program that periodically scans server memory to reclaim portions that have been reserved by programs but are no longer being used. Reclaim enough memory and you can consolidate operations onto fewer servers, he said.
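The consolidation payoff can be sketched as a bin-packing problem. This is an illustrative toy, not HP's actual tool: the workload sizes and server capacity are invented, and the packing uses a simple first-fit-decreasing heuristic.

```python
# Illustrative sketch: pack workloads onto as few servers as possible,
# using first-fit decreasing on memory footprint.
def servers_needed(workload_gb, capacity_gb):
    free = []  # remaining memory on each active server
    for need in sorted(workload_gb, reverse=True):
        for i, room in enumerate(free):
            if room >= need:
                free[i] -= need  # fits on an existing server
                break
        else:
            free.append(capacity_gb - need)  # power on another server
    return len(free)

# Hypothetical footprints before and after reclaiming unused reservations:
reserved = [48, 40, 36, 30]  # GB each workload has reserved
in_use   = [20, 18, 16, 10]  # GB each workload actually uses
print(servers_needed(reserved, 64))  # → 4 servers before reclamation
print(servers_needed(in_use, 64))    # → 1 server after reclamation
```

The scanning program itself burns some energy, but the idle servers it allows you to power down save far more, which is the "spending energy to save energy" trade-off in miniature.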

Another idea that Ranganathan suggested is to examine the power consumption of a system across the entire ensemble of components, rather than focusing on the consumption of individual units. By doing so, you can get a more realistic picture of overall energy consumption.

"When you look at large collections of systems, such as a data center with thousands of blade servers, you have extra degrees of freedom," he said. Not all the components will use all their allotted power all the time.

As an example of how this could save energy, consider how data center designers do their work, Ranganathan said.

Designers use a measurement called Power Usage Effectiveness (PUE) to help determine how much cooling equipment to run. PUE is the ratio of the total energy consumed by a data center to the amount of energy used to power the IT equipment. Designers try to make the PUE of their systems as low as possible. Using a more realistic estimate of IT power usage allows them to pick more appropriately sized cooling systems.
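The PUE ratio itself is simple arithmetic; the example numbers below are hypothetical, chosen only to show how the metric reads.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power.
    An ideal facility, with no cooling or distribution overhead, scores 1.0."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# A facility drawing 1,500 kW overall, 1,000 kW of it reaching IT gear:
print(pue(1500, 1000))  # → 1.5
```

If ensemble-level analysis shows the IT gear will realistically draw less than its nameplate total, the cooling plant can be sized smaller, shrinking the numerator and pushing PUE closer to 1.0.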

"Many systems are optimized for the peak-performance scenario. In the absence of designs that proportionally scale their energy with resource utilization, the result can be significant inefficiencies," Ranganathan said.
