Science & Technology - Posted by Daniel Oppenheimer-U. Texas on Tuesday, January 24, 2012
Battery drain may influence app design
U. TEXAS-AUSTIN (US) — The first systematic power profiles of microprocessors could help lower the energy consumption of small and large devices, say researchers.
The results may show companies such as Google, Apple, Intel, and Microsoft how to design software and hardware that lowers the energy costs of devices both very small and very large.
“The less power cell phones draw, the longer the battery will last,” says Kathryn McKinley, professor of computer science at the University of Texas at Austin.
“For companies like Google and Microsoft, which run these enormous data centers, there is a big incentive to find ways to be more power efficient. More and more of the money they’re spending isn’t going toward buying the hardware, but toward the power the data centers draw.”
McKinley says that without detailed power profiles of how microprocessors function with different software and different chip architectures, companies are limited in terms of how well they can optimize for energy use.
The study she conducted with Stephen M. Blackburn of the Australian National University and their graduate students is the first to systematically measure and analyze application power, performance, and energy on a wide variety of hardware.
“We did some measurements that no one else had done before,” McKinley says. “We showed that different software, and different classes of software, have really different power usage.”
McKinley says that such an analysis has become necessary as both the culture and the technologies of computing have shifted over the past decade.
Energy efficiency has become a greater priority for consumers, manufacturers, and governments because shrinking processor technology no longer yields exponential gains in performance and power efficiency.
The result of these shifts is that hardware and software designers have to take into account tradeoffs between performance and power in a way they did not 10 years ago.
“Say you want to get an application on your phone that’s GPS-based,” McKinley says. “In terms of energy, the GPS is one of the most expensive functions on your phone. A bad algorithm might ping your GPS far more often than is necessary for the application to function well.
“If the application writer could analyze the power profile, they would be motivated to write an algorithm that pings it half as often to save energy without compromising functionality.”
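The arithmetic behind that example is simple enough to sketch. The sketch below is illustrative, not from the study: the function names, session length, and the assumption that every GPS fix costs roughly the same energy are all invented for the example.

```python
# Hypothetical sketch: compare how often two polling policies ping the GPS
# over a one-hour session. Numbers are illustrative assumptions.

def pings_naive(duration_s, interval_s=1):
    """Ping the GPS on a fixed one-second timer, regardless of need."""
    return duration_s // interval_s

def pings_throttled(duration_s, interval_s=2):
    """Ping half as often; many apps can tolerate a slightly staler fix."""
    return duration_s // interval_s

hour = 3600
naive = pings_naive(hour)          # 3600 fixes in an hour
throttled = pings_throttled(hour)  # 1800 fixes in an hour

# If each fix costs roughly the same energy, halving the ping rate
# halves the GPS share of the app's energy budget.
savings = 1 - throttled / naive
print(f"{naive} vs {throttled} pings -> {savings:.0%} GPS energy saved")
```

The point of the sketch is that the savings come from the algorithm, not the hardware: the same radio, polled half as often, draws half the energy for that function.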
McKinley believes that the future of software and hardware design is one in which power profiles become a consideration at every stage of the process.
Intel, for instance, has just released a chip with an exposed power meter, so that software developers can access some information about the power profiles of their products when run on that chip. McKinley expects that future generations of chips will expose even more fine-grained information about power use.
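On Linux, one way software can read such an on-chip meter is through the kernel's powercap interface to Intel's RAPL counters. The sketch below assumes a system that exposes the package energy counter at the sysfs path shown; that path and its availability are assumptions, since not every chip or kernel provides it.

```python
import time

# Assumed sysfs path for the package-level energy counter (Linux powercap
# interface to Intel RAPL); not every chip or kernel exposes this.
RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def average_power_watts(energy_start_uj, energy_end_uj, elapsed_s):
    """Average power = energy consumed / time, with microjoules -> joules."""
    return (energy_end_uj - energy_start_uj) / 1e6 / elapsed_s

def measure(interval_s=1.0):
    """Sample the counter twice and report average package power."""
    with open(RAPL_ENERGY) as f:
        start = int(f.read())
    time.sleep(interval_s)
    with open(RAPL_ENERGY) as f:
        end = int(f.read())
    return average_power_watts(start, end, interval_s)

# 15,000,000 microjoules consumed over one second is an average draw of 15 W.
print(average_power_watts(0, 15_000_000, 1.0))
```

A developer profiling an application could bracket a workload with two such samples to attribute an energy cost to it, which is the kind of feedback the exposed meter makes possible.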
Software companies like Microsoft (where McKinley will spend the next year while on leave from the university) are already using what information they have to inform their designs. And device manufacturers are testing different architectures for their phones and tablets that optimize for power use.
McKinley says that even consumers might one day see how much power a given app on their smartphone will draw before deciding whether to install it.
“In the past, we optimized only for performance,” she says. “If you were picking between two software algorithms, or chips, or devices, you picked the faster one. You didn’t worry about how much power it was drawing from the wall socket.
“There are still many situations today—for example, if you are making software for stock market traders—where speed is going to be the only consideration. But there are a lot of other areas where you really want to consider the power usage.”
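The tradeoff McKinley describes can be made concrete with the identity that energy equals power multiplied by time. The figures below are invented for illustration, but they show why the faster option is not automatically the cheaper one.

```python
# Illustrative sketch of the performance-vs-power tradeoff: two hypothetical
# algorithms for the same task, with invented power draws and runtimes.

def energy_joules(power_watts, runtime_s):
    """Energy consumed is average power multiplied by runtime."""
    return power_watts * runtime_s

fast = energy_joules(power_watts=50, runtime_s=2)  # 100 J: fast but hungry
slow = energy_joules(power_watts=20, runtime_s=4)  # 80 J: slower but frugal

# Optimizing purely for performance picks the 2-second run; optimizing
# for energy picks the 4-second one, even though it takes twice as long.
print(fast, slow)
```

A stock-trading system would still pick the fast algorithm; a phone running on battery, or a data center paying the power bill, might well prefer the slow one.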
More news from University of Texas at Austin: www.utexas.edu/news/