Why Laptop Battery Claims Are So Useless, And Why That Won't Soon Change

It's one of those things we take for granted: official laptop battery life claims have an extremely tenuous relationship with reality. Not surprisingly, everyone's using the same tricks to conjure their silly estimates—and they don't plan on stopping.

AMD, as part of some kind of PR campaign, is saying the culprit is a battery testing suite called MobileMark 2007:

The parameters for this test include having the screen at just 20 percent brightness, Wi-Fi turned off and no music, video, games or Web pages running. More or less, the test turns a computer into a dimly lit clock, then sees how long it can run.

That is exactly the kind of test you'd have to run to hit manufacturers' 50-100%-inflated figures, and the perceived ubiquity of the test gives it an air of authority—or at least respectability—within the industry. Using anything more honest would put a manufacturer at a competitive disadvantage.

This is where the story lapses into accusations of subterfuge: AMD says these tests don't just benefit laptop manufacturers in general—they're unfairly biased towards Intel, whose chips are optimized for these less-than-realistic scenarios. It's easy to see how this would be upsetting, but it's not clear what AMD can really do. They're proposing a system by which manufacturers show two battery ratings—one that shows a theoretical, low-use maximum, and one that reflects heavy use. (To its credit, Sony already does something like that.) Intel tacitly admits the practice, but is predictably standoffish about it, while AMD doesn't inspire much confidence:

By 2010 or 2011, something might show up from a consortium that could be used. It takes two to three years.

Well, thanks for trying, I guess! [NYT]