Resolved: We Need More Realistic Notebook Battery-Life Claims
One of the best features of Apple's new MacBook Airs hasn't gotten all that much attention. Here's Steve Jobs announcing it last week:
That's the 13.3″ Air Jobs is talking about -- later on at the event, he introduced the 11.6″ version and said it got up to five hours, again with the tougher tests.
I've been using the 11.6″ MacBook Air over the past week and a half, and judging from my experience, Apple's estimate of five hours is indeed realistic. It's about what I'm getting -- which is a pleasant surprise considering that I'm used to discounting the battery life claims made by laptop manufacturers (including Apple) by anywhere from thirty to sixty percent. The Air's five hours remind me more of the ten-hour claim Apple makes for the iPad; it seems fair.
Now, I'm not saying that typical battery claims are lies, and I know that the words "up to" are meant to indicate that the numbers aren't promises. They're way closer to best-case scenarios than Harry-case ones -- what you may get if you don't push the notebook very hard. And even then, they feel high in many cases. Last year, AMD's Pat Moorhead guest-blogged here and noted that PC manufacturers' battery tests tend to involve cranking screen brightness way, way down. But I do that myself -- and turn off features like Bluetooth, and opt for "power saver" modes -- and I still generally fall short of the claims.
I don't seem to be a freaky exception, either. I did a totally unscientific survey of my Twitter followers, asking them if they ever get remotely the battery life that manufacturers claim. Most of them said the claims don't jibe with their personal realities:
I think we all know that there are claimed battery lives and actual ones, and that they often have nothing to do with each other. (One of my favorite computer merchants, Dynamism, is one of the few that acknowledges this: Its laptop listings generally mention both the vendor estimate and a real-world one.)
Wouldn't both notebook manufacturers and notebook shoppers be better served if the industry adopted tests similar to whatever Apple is now using? The current Fantasyland approach may help sell laptops, but it also ensures that a high percentage of buyers will face disappointment the moment they start using their new machine. It leads to consumers assuming (reasonably) that they can't trust the companies they buy computers from. That can't be good for anyone.
Barring a move to more realistic figures, how about always quoting a range rather than one magic number? It's clear that performance will vary wildly depending on how a computer is used, so wouldn't it be logical to say that a given machine may get anything from four hours to twelve hours?