Lots of claims are made about which browser is better or worse for a device’s battery life. Can a browser really make that much of a difference? Yes indeed, but determining just how much of a difference and whether it even matters to your individual use case is the difficult part.
I began testing the question of different web browsers’ impact on battery life about two months ago, and what I’ve concluded is that there’s a lot of work to be done here.
What’s generally wrong with browsing tests
I’ve read about people using browsing as a rundown test for laptops, but I have concerns about how that’s done. As you know, the internet is a dynamic, living organism. What I get when I point my browser at PCWorld.com at 2:14 a.m. EDT on August 29 is going to be different from what I get at 8 a.m. on January 1.
Even trying to browse with the same laptop just minutes apart could yield quite a different experience in terms of Flash ads, embedded videos, and other dynamic elements.
That’s not even mentioning that the route the packets take to reach your screen could differ considerably moment to moment. These and other uncontrollable variables are enough to scare me off of running comparative tests using the live internet.
Enter EEMBC BrowsingBench
EEMBC is a small benchmarking outfit that claims its BrowsingBench test removes the variability from browser testing. The benchmark runs on Linux from a USB key: you boot a server laptop into Ubuntu, connect it via Ethernet to a wireless router, and then connect your test laptop to that router’s Wi-Fi.
You select page types, how long you want the test to dwell on each page, and even the bandwidth you want simulated. The pages are stored and served by the benchmark itself, which means every single page and every single Flash ad is identical from run to run.
I configured BrowsingBench with a rather long “dwell” time on each page, rather than just jamming through a bunch of pages. I figured people don’t browse that way, so there’s little value in testing it.
If you were to watch the benchmark run, it would look like a person went to a site, scrolled down maybe a third of the page, paused, scrolled another third, paused, and so on.
It’s so not perfect
As much as I think EEMBC’s BrowsingBench is pretty nifty, it’s far from perfect. The test is actually designed to scale to phones, tablets, and even set-top boxes. It includes webpages that are pure mobile sites as well as the desktop versions (I selected only the desktop versions), but the pages are clearly very light workloads for a PC.
The test is also designed for single-tab browsing, which just isn’t realistic today. EEMBC officials tell me they’re working on heavier page loads for the next version of BrowsingBench, but you have to go with the benchmark you have, not the one you want.
That doesn’t discount the results I’ll show you here today, but you should know that they reflect a light-duty-browsing scenario.
I used the same Toshiba Radius 12 that I used for my media-player shootout. This laptop has a Core i7 Skylake CPU, 8GB of RAM, an M.2 SSD, and an Ultra HD 4K panel with 10-point touch. With its 41-watt-hour battery, battery life is a little underwhelming. That’s to be expected, though, as both 4K resolution and touch can be draining. I ran all of my tests at 155 nits, which is a reasonable brightness for an office environment where you’re trying to save power.
The laptop was running Windows 10 Home with the latest updates installed prior to starting the tests. Once I updated the laptop, it stayed off the Internet to keep the OS at a consistent state.
To test the consistency of the benchmark, I ran repeated tests in Chrome (each of which took several hours), and the results were within four minutes of each other. I used a Linksys 802.11n router for the tests, which was about two feet from the test laptop.
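To put that four-minute spread in perspective, here’s a quick back-of-the-envelope calculation. The ~380-minute run length is just a representative figure in the ballpark of my results, not an exact measurement:

```python
# Run-to-run spread as a fraction of total run time. The 4-minute spread
# is the measured repeatability; 380 minutes is a representative run
# length for this test, used only for illustration.
spread_min = 4
run_min = 380
variation_pct = 100 * spread_min / run_min
print(f"{variation_pct:.1f}%")  # prints 1.1%
```

In other words, run-to-run variation was on the order of 1 percent, which is tight enough to trust differences of half an hour or more between browsers.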
Remember, I began my testing about two months ago so the browser versions are what was current at that time. For example, this was started prior to Opera pushing out its power-saving-mode version. The browsers I tested include: Chrome 50, Firefox 46, Edge 13.1, Opera 37, and Internet Explorer 11. (I did try a beta of Opera 39 with its power-saving mode switched on much later but ran into an issue where pages would not load correctly.)
As Firefox and Opera do not include Flash support by default, I installed the Adobe Flash plugin for both; all the browsers ran Flash 21.0.0. The only 64-bit browser was Microsoft’s Edge. The rest were the 32-bit versions, which are the default even if you’re running a 64-bit OS.
This is a lot of lead-up to something that’s a little anticlimactic. My testing with a “light” browser load shows that Microsoft makes both the most power-efficient browser and the least power-efficient one.
Yes, Microsoft’s Edge 13.1 browser was clearly the winner here. I hit 385 minutes with the Edge browser, which is almost an hour more than Internet Explorer 11 lasted in browsing.
Google’s much-maligned Chrome (which has a reputation for being a power hog) pulled into second place with about half an hour less battery life than Edge. Firefox was just about as bad as Internet Explorer, and Opera was on par with Chrome.
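A quick sketch of the spread, in code. Only the Edge figure (385 minutes) is exact; the other run times below are approximations inferred from the wording above (“almost an hour less,” “about half an hour less,” “on par with”), not precise measurements:

```python
# Approximate battery-life results (minutes) from the light-browsing test.
# Only Edge's 385 is an exact figure; the rest are rough inferences from
# the article's own relative descriptions.
results = {
    "Edge 13.1": 385,
    "Chrome 50": 355,    # roughly half an hour behind Edge
    "Opera 37": 355,     # "on par with Chrome"
    "Firefox 46": 330,   # "just about as bad as Internet Explorer"
    "IE 11": 328,        # almost an hour behind Edge
}

baseline = results["Edge 13.1"]
for browser, minutes in sorted(results.items(), key=lambda kv: -kv[1]):
    pct_behind = 100 * (baseline - minutes) / baseline
    print(f"{browser:12s} {minutes:4d} min  ({pct_behind:4.1f}% behind Edge)")
```

Even the worst case here trails Edge by well under 20 percent, which matters for the comparison to Microsoft’s numbers below.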
The thing is, you were probably expecting far more dramatic results. Kinda like Microsoft’s own test that it released this week. If you didn’t catch it, Microsoft’s tests, conducted apparently on the open internet using a specially instrumented Surface Book, showed 36 percent to 53 percent better battery life over the competition when browsing in Edge. In a video test, Edge edged out Opera’s new power-saving mode by 17 percent, bested Firefox by 43 percent, and Chrome by a whopping 70 percent.
The company also showed all four browsers running a streamed video until they tapped out, with Edge again taking the lead.
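To put those percentages in context, here’s what a claim like “70 percent better battery life” implies in minutes. The 300-minute Chrome baseline is purely a made-up assumption for illustration; Microsoft didn’t publish the raw run times behind these figures:

```python
def implied_edge_minutes(other_minutes, pct_better):
    """Run time implied by Edge lasting pct_better percent longer
    than another browser's run time."""
    return other_minutes * (1 + pct_better / 100)

# Hypothetical example: if Chrome managed 300 minutes, a 70 percent
# advantage would put Edge at 510 minutes.
print(f"{implied_edge_minutes(300, 70):.0f} minutes")
```

That’s a far bigger gap than the roughly 30 minutes I measured, which is exactly why the workload being tested matters so much.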
Microsoft further showed telemetry from “millions” of Windows 10 machines that it has captured, which supports its results (umm, does anyone else think it's creepy that your OS is dutifully reporting anonymous telemetry data to be used for marketing purposes?).
I don’t actually doubt Microsoft’s numbers. In fact, they reinforce my own personal experience using various browsers. Chrome “seems” to cause the battery to plummet, while Edge “seems” to sip power during use.
But that’s just my anecdotal experience and without the ability to measure it reliably, I’ll just leave it at that. And to be perfectly honest, I still use Chrome, except when I’m really trying to maximize battery life. Then I switch to Edge.
So here’s the thing. My own tests show Edge has a clear power advantage in light browsing chores; it’s just not as dramatic as Microsoft’s own tests. But the truth is actually more complicated, because our browsing habits are so different and can change from day to day. If you play a game or use Outlook all day, you can make a pretty good guess about how each will impact battery life. A browser, though, is a window to the unlimited and ever-changing internet, and no one uses it the same way.
Do you sit with 10 tabs of Flash- and video-heavy webpages open all day? Or do you sit in Google Docs for eight hours? Do you park your browser on YouTube or some shady streaming website for long stretches? All three of those use cases will likely have very different effects on battery life and going by anyone’s generic “browser battery-life” figures doesn't make much sense.
Are browser benchmarks still valuable? Yes, but only to the extent that you understand the scenario being tested. For example, after doing my tests, I’m pretty confident telling you that if you’re just doing very light web browsing with the screen brightness at a medium-to-dim 155 nits, Edge is the most power-efficient choice, but the others ain’t so bad either.