> More recently, I read the argument that it’s so bad that Chrome running it’s Chromium engine ought not be allowed to exist on iPhones and iPads.
The subtext is John Gruber's post 6 days ago, defending Apple's iOS lockdown: "Imagine — and this takes a lot of imagination — if Google actually shipped a version of Chrome for iOS, only for the EU, that used its own battery-eating rendering engine instead of using the energy-efficient system version of WebKit." https://daringfireball.net/2024/09/ios_continental_drift_fun...
He has a huge hard-on for hating on the EU for daring to regulate Apple.
It’s been endless hit pieces about how they suck at innovation, how having Apple Intelligence is better than having Fortnite, and how users having the option to install what they want on their devices is the worst thing to ever happen.
Hypocritically being silent when China forces Apple to adopt RCS and has countless regulations on what they can do.
It’s almost as if he’s trying to push Apple’s agenda of why they shouldn’t be regulated anywhere. A more cynical person might even consider that he’s compensated to do so.
But then you look at his past pieces about remote work being the devil, even though he’s worked like that his whole life, and the simplest explanation, that he’s an out-of-touch person, seems to be the right one.
> Hypocritically being silent when China forces Apple to adopt RCS and has countless regulations on what they can do.
He recognizes that it was China (not the EU) forcing Apple's hand when it comes to RCS, and since China is a large market and Apple is mostly/generally a hardware company, they want to sell hardware there:
When I read that post, I didn't necessarily disagree with Gruber on the encryption front, I just think having a better base for 'cross-platform' messaging that didn't involve third-party software (Signal, WhatsApp (and having to use their infra)) was still worth it—even if unencrypted. Gruber wants the features of RCS but also encryption:
> Ihnatko is right, but only if you believe that carrier-based messaging should remain the baseline. I do not. And it’s also a U.S.-centric viewpoint. In most countries around the world, platforms like WhatsApp, Line, and Facebook Messenger serve that role, as the baseline “everyone has it” messaging platform — and those countries are better for it. I prefer iMessage, personally, for multiple reasons, but iMessage is fundamentally limited from serving that “everyone has it” baseline role by Apple’s decision not to ship an Android client. Eddy Cue doesn’t lose many arguments but he lost that one. All of the effort spent pushing Apple to support RCS would have been better spent pushing Apple to ship iMessage for Android. And without a supported iMessage client for Android, that role ought to go to WhatsApp, not RCS. WhatsApp is free, secure, and works equally well on all phones.
> Meta knows this, and clearly smells the opportunity. Does Apple?
* Ibid
Google's E2E RCS is a proprietary extension, and I'm not sure how telcos can implement E2E given things like CALEA in the US, so E2E may be stuck in the realm of the non-telco (unlike SMS and RCS, which are telco-run).
The “USB-C invented by Apple” legend was even presented as fact on Wikipedia for a good while. Someone fixed it a few months ago. The source, of course, was Gruber and some “little birdies”. Ironically, the referenced 9to5mac article goes on to look at specs/press releases and concludes that there’s no evidence for his claim; in fact, the credit mostly goes to Intel and TI.
As is to be expected by anyone who has been on Apple platforms since the glory days, as opposed to the folks who only know the Apple ecosystem as a Linux alternative for UNIX laptops.
He's a total bootlicker shill. His takes are often dishonest and ignorant. He's not officially an Apple employee but he's effectively "second party". His access to insiders is predicated on his subservience. He doesn't get an annual WWDC stage interview with Craig Federighi without bending the knee.
That said, I still read Daring Fireball every week. His takes may be super biased and wrong, but it's still useful to read things from that perspective.
People talking themselves into violently agreeing that yes, being able to run Game Boy emulators and Firefox on their iPhones is unequivocally bad is something to behold and shows how far in space and time the RDF really reaches.
Yeah, there's lots of half-fact nonsense going around in this space.
But really that's the real takeaway here: the reason "everyone knows" something that was false[1] is that no one actually cares about the facts. Modern laptops do extremely well with battery life, MacBooks best among them, and frankly no one is away from a charger for that kind of period in the modern world. You know the thing will always have power, so you never bother to notice or measure what might affect it.
But you still want to argue anyway, if for no better reason than justifying the $3k you dropped on the device, so... "Chrome hurts battery life!" becomes a shibboleth denoting your membership in the right subculture. It doesn't have to be true to do its job.
[1] And, yes, there is a direct analogy to be made here about the current US political debate about immigration. I won't elaborate but I'm sure people see it.
What if—I realize this is crazy but hear me out—Chrome actually consumed more power at some point in the past? Why would Google claim that they fixed Chrome if it was never broken?
Well... did it? Again there was a hypothesis, and a measurement that disproved it. It's not very good science to say "well, the hypothesis may still have been right on this older system that wasn't measured". What you're applying is essentially conspiracy logic: you can't validate an incorrect statement by pretending that it might have been true "at some point in the past".
Remember “Chrome is bad”? (As proven by the empirical method of “I deleted a bunch of Google named things and now not only does my laptop run cooler, my laundry uses less water as well!”)
The heaviest parts of the test were using Google sites. Google has been caught in the past letting their sites run worse on Safari than Chrome. I'd really like to see this test done without having a single Google property be involved.
Agree with this. As soon as it became apparent a large proportion of battery usage would be dominated by the YouTube activity, I became suspicious. It's not just Safari where we've seen Google playing dirty tricks either. Back in 2018 it was this:
> YouTube page load is 5x slower in Firefox and Edge than in Chrome because YouTube's Polymer redesign relies on the deprecated Shadow DOM v0 API only implemented in Chrome
Of course Google is using their own browser to showcase whatever new shit they have developed, or they expand Chrome when one of their properties needs some sort of new feature. It's been quite a while since there was innovation in the browser scene from anyone but Chrome... on the contrary, over the last few years a lot of browsers either gave up entirely or moved to Chromium. Including Microsoft.
>> YouTube accounted for 8.5% of total TV viewing in May, while Netflix was a close second at 7.9%.
Assuming that's true, I still don't think it matters that much.
I expect that a very large proportion of Netflix that's watched is watched on a smart TV. Much larger than the proportion of Youtube that's watched on a TV.
Interesting! Almost all my video watching is on a roku box attached to my TV. I may watch a tiny bit more Youtube than Netflix on my PC or phone, just because people send me links that way, but if the videos they link me are more than a minute or two, usually I'm saving them to "watch later" and catching them on the TV box.
I'd have assumed most people's proportions were similar and that therefore the proportions for the two services were similar; that is to say, for people who use both kinds of devices to watch video, I wouldn't have expected a big difference based on the specific service.
Do you have a source that suggests that people who use both kinds of devices watch a larger proportion of Netflix on the TV but a larger proportion of Youtube on PC?
This article isn't about streaming activity in general, it's about the Nielsen ratings, which as far as I can tell wouldn't include views on creators like MrBeast, just proper 'television'.
A follow-up test to investigate whether Google sites are more power-hungry than similar non-Google sites does not have to be representative of general web browsing patterns, because the original test already addressed that question.
I know many people do watch YouTube videos but indeed, when a link leads to a YouTube video I don't follow it (and nobody ever just sends me links to YouTube).
I do turn to YouTube if I need some kind of visual guide, like for auto repair sometimes, but it's rare.
Most links to YouTube videos I find are excruciatingly slow explanations of something that could be done in two paragraphs and one screenshot, or some kind of meme that I don't find funny.
> I do turn to YouTube if I need some kind of visual guide, like for auto repair sometimes, but it's rare.
My current car has a particular enthusiast that has a personal site with detailed text-with-photographs guides to most of the common repairs, and it's awesome.
When I buy my next car I'll probably buy a repair manual to go with it.
I normally look for text alternatives, to be honest. It's not necessarily a bias against YouTube, but more video-based content in general. If I can't find an alternative I'll skim it with subtitles on. I certainly wouldn't have it open for one and a half hours in a 3 hour period though!
If a friend sends me a link to YouTube, I might click on it to see what the video is, and then close it without watching it. This doesn't happen very often though. If someone gives me a timestamped link and tells me to watch 30s at that spot I'll humor them, but that's about my limit.
That perfectly describes my behaviour. I don't care for video content. (I similarly don't open tiktok or instagram reel links.)
For a few things I've found where youtube has some specific content I'm interested in, I'll use yt-dlp to download and archive it to watch outside of youtube.com.
I simply cannot remember the last time I watched 90 minutes of YouTube in a 3 hour period. I don't think that activity is representative of typical usage at all!
If I look online for stats relating to use of the YouTube app in the UK, it's 20 hours a month. If that's distributed equally across the month, that's roughly 40 minutes a day, which is definitely not 1.5 hours in any one sitting.
"YouTube", like "Twitter", has never been one thing, one culture. There's a zillion long tail creators making all kinds of unmonetized stuff because they want to.
> Google has been caught in the past letting their sites run worse on Safari than Chrome
Is that... actually true? Or is this just a way of spinning something like "gsheets used a chrome-only extension before it was standardized". Has there been coverage of divergent power draw between chrome and other browsers on google sites?
It's pure apologism. The only thing Google has been shown to do is not test on browsers that have lower market share (which I personally don't like as a mobile Firefox user) or to have fewer features on browsers that don't support all the APIs, which would presumably result in longer battery life.
Assuming the Mac battery reporting is accurate, even when running tests at different charge levels, seems specious to me. In my experience, it doesn’t tend to be. I don’t think it’d be biased to one browser over another in that way, but I don’t for a second believe that it can be used to make a statement like “Chrome used 17% of my battery life over 3 hours, then Safari used 18%.”
I think it would be much more interesting to put together ~40 hours worth of testing similar to what the author did, then run it with Safari until the battery dies, charge the machine for X hours (where X is the amount of time the battery takes to report 100% plus some margin) then run it with Chrome until the battery dies. Repeat as many times as you think necessary.
That would take battery percentage remaining reporting out of any load bearing place, which I believe is absolutely necessary here.
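If someone wanted to automate that, a minimal sketch of such a full-discharge timing harness might look like the following. It assumes macOS (where `pmset -g batt` reports the current charge percentage) and leaves the actual workload abstract; nothing here is from the original test, and the cutoff value is arbitrary.

```typescript
// Hypothetical full-discharge timing harness (a sketch, not the author's setup).
// Assumes macOS, where `pmset -g batt` prints the current charge percentage.
import { execSync } from "node:child_process";

function batteryPercent(): number {
  const out = execSync("pmset -g batt").toString();
  const match = out.match(/(\d+)%/);
  if (!match) throw new Error("could not parse pmset output");
  return Number(match[1]);
}

async function runUntilDrained(runWorkloadOnce: () => Promise<void>, cutoff = 5) {
  const start = Date.now();
  // Repeat one pass of the browsing macro in the browser under test
  // until the battery falls to the cutoff, then report elapsed wall-clock time.
  while (batteryPercent() > cutoff) {
    await runWorkloadOnce();
  }
  const hours = (Date.now() - start) / 3_600_000;
  console.log(`Battery reached ${cutoff}% after ${hours.toFixed(2)} h`);
}
```

Timing from a full charge down to a fixed cutoff sidesteps the question of whether the percentage readout is linear, since only wall-clock runtime gets compared between browsers.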
As I read it, the author performed half the tests with Chrome followed by Safari, and the other half with Safari followed by Chrome, which should have eliminated any first-order bias. Though I agree that the magnitudes of the numbers aren't really meaningful, short of doing a full 100%-to-0%-SoC discharge with each browser.
I would be curious to see results with Firefox as well. I like to see people testing assumptions. I agree with the author’s primary point- it’s likely highly dependent on what tasks you are doing with the browser. The results are still interesting nonetheless.
One other potential area of variability could come from browser extensions - I imagine that users who compare browser power performance are more technical than the median user, and are more likely to run browser extensions (e.g. ad blockers, etc).
Given Chrome has a larger and more extensive collection of extensions, perhaps users who see differences are running more browser extensions in their Chrome installation, which impacts performance/power usage?
Certainly interesting to see these assumptions put to the test though, and get some data around them. I noticed Firefox's browsing performance on Speedometer had caught up for a while, contrary to what I had thought/assumed, although it looks like it may have fallen behind again a little.
I think one big issue is that there are OS specific APIs on Windows and Mac that allow you to only redraw the pixels/layers of the web browser window that have changed, with the unchanged portion retained in memory and recomposited.
We've seen versions of Safari and Edge that leverage those sorts of APIs to deliver better battery life on their respective platforms, but if you are writing a cross platform browser you may not be willing to do extra work for each individual platform.
Several years ago, Firefox adopted the solution Chrome used of dividing the web browser window into large-ish sections, so you could at least skip redrawing sections that had not changed, but that still leaves you doing unnecessary work, just less of it.
Prior to that, the battery drain using Firefox vs Safari on newer machines with higher resolution displays was very noticeable.
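As a toy illustration of the tiling idea described above (not WebKit's or Gecko's actual code), the bookkeeping amounts to marking damaged tiles and repainting only those; real engines back the clean tiles with GPU-resident layers (e.g. Core Animation on macOS) so they can be recomposited for free.

```typescript
// Toy damage-tracking sketch: split the viewport into tiles and repaint only
// the tiles intersecting a changed rectangle. Clean tiles are recomposited
// from their cached contents.
type Rect = { x: number; y: number; w: number; h: number };

class TiledSurface {
  private dirty: boolean[];

  constructor(private cols: number, private rows: number, private tileSize: number) {
    this.dirty = new Array(cols * rows).fill(true); // everything needs an initial paint
  }

  invalidate(r: Rect) {
    const x0 = Math.floor(r.x / this.tileSize);
    const y0 = Math.floor(r.y / this.tileSize);
    const x1 = Math.min(Math.floor((r.x + r.w - 1) / this.tileSize), this.cols - 1);
    const y1 = Math.min(Math.floor((r.y + r.h - 1) / this.tileSize), this.rows - 1);
    for (let y = y0; y <= y1; y++)
      for (let x = x0; x <= x1; x++)
        this.dirty[y * this.cols + x] = true;
  }

  paint(repaintTile: (x: number, y: number) => void) {
    for (let y = 0; y < this.rows; y++)
      for (let x = 0; x < this.cols; x++)
        if (this.dirty[y * this.cols + x]) {
          repaintTile(x, y); // only damaged tiles burn CPU/GPU time
          this.dirty[y * this.cols + x] = false;
        }
  }
}
```

The coarser the tiles, the more unchanged pixels still get repainted when a small region changes, which is the "unnecessary work, just less of it" trade-off mentioned above.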
We're talking about using features exposed by the OS (the Core Animation layers API in this example) to avoid doing unnecessary work to save battery life.
Any browser could adopt each individual platform's proprietary graphics APIs, but using the same API everywhere (traditionally OpenGL) is less work.
Any browser engine is free to do the work to use per-platform APIs. If your argument is that cross-platform browser engines won't do the work to optimize with platform-specific APIs but that Safari will, WebKit is a cross-platform browser engine and so it adopting CoreAnimation on Apple platforms in addition to supporting other platforms is exactly the same amount of work that it would take for Blink or Gecko to do the same thing.
For my use case, it's quite obvious what's draining the battery: addons. Every trick and feature meant to preserve privacy and to remove visual trash will significantly impact load times, responsiveness, and battery life.
It's all worth it to me, but there's no doubt a web without tracking and ads would easily double my battery life.
It seems obvious to me that uBlock Origin uses less battery than an autoplaying video in the corner of every website you visit. So on balance I don't think it's fair to say the addons are the cause.
While there will be extensions which increase power consumption the opposite is true for the most used category, that being content blockers. A well-tuned uBlock Origin will cut down radically on the number of requests performed per page, the amount of CPU time wasted on non-essential Javascript, the amount of GPU render time wasted on presenting those horrid moving monstrosities called ads and the amount of energy wasted by the user while he waits for the damn page to stop loading.
Never, ever venture out on the web without a content blocker.
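For a sense of the mechanism, here is a minimal sketch of how an extension can cancel requests before they ever reach the network, using the WebExtensions blocking webRequest API (the `browser` namespace is Firefox's; Chrome's Manifest V2 equivalent was `chrome`). The host list is a made-up placeholder; real content blockers like uBO compile far more sophisticated filter lists and also do cosmetic filtering.

```typescript
// Minimal request-blocking sketch in the style of a WebExtensions background script.
// The domains below are placeholders, not a real filter list.
declare const browser: any;

const BLOCKED_HOSTS = ["ads.example.com", "tracker.example.net"];

browser.webRequest.onBeforeRequest.addListener(
  (details: { url: string }) => {
    const host = new URL(details.url).hostname;
    // Cancelling here means the request never leaves the machine:
    // no DNS lookup, no TLS handshake, no bytes downloaded, no script executed.
    if (BLOCKED_HOSTS.some((h) => host === h || host.endsWith("." + h))) {
      return { cancel: true };
    }
    return {};
  },
  { urls: ["<all_urls>"] },
  ["blocking"]
);
```

Every cancelled request is CPU, radio, and render time that never gets spent, which is why a blocker can be a net win for battery even though the filtering itself isn't free.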
When running a browser performance benchmark, generally not - the ad block extension adds overhead to the page. I saw this when experimenting with Orion Browser on Mac, which uses the WebKit engine but adds support for many Firefox and Chrome web extension APIs.
In experimenting with that, I noticed that enabling and using many extensions during benchmarks could easily affect scores. Even just an ad blocker like uBO had a measurable impact on a benchmark, from my recollection.
Orion is using a web engine that was never really designed to support what they are trying to make it do. Obviously it's just code in the end but a lot of the changes will be made for convenience rather than in ways that make sense if you look at WebKit holistically, because the team is just too small to actually do that. So it is natural that performance suffers.
Depends on what sites you visit, but hooking every single HTTP call and HTML DOM isn't cheap. I often find websites with ads to be just as fast as websites without them, just more cluttered, unusable, and more of a privacy nightmare. The uBO overhead does seem to impact CPU usage, though.
I'll gladly pay the 15 minutes of battery power spent on filtering out that trash.
My question is why use Safari, other than it being preinstalled? It feels barebones. There must be some reason most Mac users go out of their way to avoid Safari: ~70% of Mac users use Chrome or Chromium browsers, versus only 19% who use Safari and 3% who still use Firefox [I searched "mac browser market share" and clicked the first link]. My guess is it's the lack of good extensions. People spend a lot of time in the browser, so lack of customizability is frustrating.
Because it’s barebones. Most of the time I’m on my Mac I use “regular” apps; I never really used web apps much (neither on my Mac nor on my OpenBSD/Linux machines).
History/bookmarks/tabs/passwords are easily imported via Safari > File > Import From > Google Chrome/Firefox, and Safari can log into Google accounts just as easily as Chrome can.
I guess most people don't follow browser development anymore.
We are now in 2024.
Perhaps the peak of complaints about Chrome's battery drain was somewhere between 2018 and 2020. It also happens to be the peak of "Safari is the new IE", with so many web features missing and bugs unresolved. Both were correct to a certain degree and had been the case for many years before reaching what could be described as a PR crisis.
Since then Safari has shipped twice as many features and bug fixes as usual, if not more, across the next few releases, while Chrome worked on multi-tab memory usage reduction and efficiency. At the same time Firefox just went into polishing mode, because a lot of the efficiency work had already come from Servo, E10s and MemShrink over the past 10 years.
In multi-tab usage (~50 to 80 tabs) Chrome is already better than Safari, simply because Safari still doesn't consider lots of tabs on macOS one of its usage scenarios, and Chrome has been better at that for at least 2 years. For 7000 tabs it is still better to use Firefox. I guess that is what I'd call battle tested. My record was only around 2000 tabs, and that was 10 years ago.
As a matter of fact, I would consider current Firefox (130) to be the best browser on the market, for single-tab or multi-tab usage, being the fastest and most efficient. The last time this happened was in the pre-Chrome IE 7 era. (One could argue IE 6 was better than Firefox.)
The most serious problem with the web IMO is the endless scope creep. Maybe, idk, set a clear goal, work towards it, and then just stop and consider the web "done"?
I hate it so much that almost everything in the IT industry is a process. It's necessarily unending. No one is shipping finished products anymore. Everything is in eternal beta. The web standards are the most egregious example of this.
I think what we need is not an HTML6, but rather HTML4-ish and something separate. I want web pages and web apps to be distinguishable, and not have every newspaper article and blog post hosted inside of a heavyweight application framework that either breaks or re-implements all kinds of basic user agent functionality. Simple pages should be implemented in a simple, constrained technology stack and heavyweight web apps should be something the end user opts-in to using.
> I want web pages and web apps to be distinguishable, and not have every newspaper article and blog post hosted inside of a heavyweight application framework that either breaks or re-implements all kinds of basic user agent functionality. Simple pages should be implemented in a simple, constrained technology stack and heavyweight web apps should be something the end user opts-in to using.
I don't see people putting that genie back in the bottle. Not when there are so many designers that override the scrollbar for aesthetics.
For the record, I'm with you - I think it would be great if most websites ran on gopher 2.0 that used markdown for its syntax. I just don't think it'll happen.
I don't see a way to make it happen, either. But it seems crazy to me that browsers can be adding so much OS-like functionality that's a risk to security or privacy without even attempting to bundle them under a "web app" permission to simplify the user experience of opting in to allowing a domain to do all the things a simple web page doesn't need.
And it's absurd that there's seemingly no way for a Chrome user—even with extensions—to prevent a web page from restyling scrollbars into something unrecognizable.
> The web standards are the most egregious example of this.
Of course they are, the end game is browsers being the OS. But it's way too risky to give web sites direct unfettered access to the computer (ActiveX, remember the time and the many ways you could be fucked by that?), so a loooot of stuff has to be built as abstractions. WebGL/WebGPU, WebUSB, WebSerial, WebBluetooth... the only thing I'm still pissed about that didn't make the cut is WebSQL - IndexedDB and localStorage just suck in comparison to a proper SQL shell.
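The gap being mourned there looks roughly like this: the same "find recent rows" query in the long-deprecated WebSQL API versus what IndexedDB asks of you. This is a sketch for comparison only; the database, store, and field names are made up, and WebSQL no longer exists in current browsers.

```typescript
// WebSQL (deprecated, never standardized): plain SQL.
declare function openDatabase(name: string, version: string, desc: string, size: number): any;

const db = openDatabase("notes", "1.0", "notes", 2 * 1024 * 1024);
db.transaction((tx: any) => {
  tx.executeSql(
    "SELECT * FROM notes WHERE created > ? ORDER BY created DESC LIMIT 10",
    [Date.now() - 86_400_000],
    (_tx: any, results: any) => console.log(results.rows)
  );
});

// IndexedDB: the same query needs an object store, an index, and cursor plumbing.
const req = indexedDB.open("notes", 1);
req.onupgradeneeded = () => {
  const store = req.result.createObjectStore("notes", { keyPath: "id" });
  store.createIndex("created", "created");
};
req.onsuccess = () => {
  const range = IDBKeyRange.lowerBound(Date.now() - 86_400_000, true);
  const cursorReq = req.result
    .transaction("notes")
    .objectStore("notes")
    .index("created")
    .openCursor(range, "prev"); // descending by creation time
  const out: unknown[] = [];
  cursorReq.onsuccess = () => {
    const cursor = cursorReq.result;
    if (cursor && out.length < 10) {
      out.push(cursor.value);
      cursor.continue();
    } else {
      console.log(out);
    }
  };
};
```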
Take a look at how Apple killed Flash: they made a product that people flocked to, that didn't support Flash. This put vendors that didn't use Flash at an advantage compared to those that did, which started a vicious cycle for Flash.
I don't want to get rid of JS entirely, I want to discourage building apps with it. I want it to be the macro language for hypertext like it was originally intended.
> Cards on the table, I’m an Arc guy on the desktop
I have used Arc on my M1 MBP a couple of times but don't have enough usage to say anything about its performance. What are its advantages over Safari or Chrome?
It's hard to explain in a comment, but I started off very skeptical of "skinned Chromes" like Vivaldi, Brave, etc. before trying and loving Arc. In short, I love its general UI paradigm and its customizability. In no particular order: command bar as a replacement for the nav bar, first-class horizontal tabs, great keyboard shortcut support, great multi-profile support and automation, split-screen browsing, tab and download management, and more. It has some AI features I don't care about, but they are easy to turn off.
With Arc I feel much more efficient, and there are fewer barriers to browsing the web in a way I grok. I'd say it's worth an install to try.
Are there any specific shortcomings you think render this test "useless"? It doesn't seem at all unreasonable to extrapolate from a 3-hour test repeated six times to estimate power usage over an 8+ hour span. Or is the "small sample" you refer to the 20-minute inner loop?
I don't agree that it's useless either, but I would expect the test macro to match what I do while working: opening multiple PRs in new tabs all the time, cloud vendor portal tabs, looking through logs and metrics, scrolling (more than watching) some YT video, lots of scrolling and switching between the 15 open tabs, navigating enterprise software portals that are badly written web apps (looking at you, Salesforce)... this is a normal workflow for me and I feel that it questions the crazy in the crazy macro. It also questions the sanity in me, but that's beside the point.
Having a good number of background processes running (terminal stuff, IDE, Slack, Google Drive, OneDrive, VPN, etc.) is probably hard to test without introducing more variance, but I can imagine they could play a part. Especially on an 8 GB unified memory MacBook.
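If someone wanted to script a work-like macro along those lines rather than the video-heavy one, a hedged sketch using Playwright could look like the following. The URLs, tab count, and durations are placeholders, and a Playwright-driven Chromium/WebKit build isn't identical to retail Chrome or Safari, so this would be for designing the macro, not for the power numbers themselves.

```typescript
import { chromium } from "playwright";

// Placeholder "work-ish" tab set; swap in real PR/log/dashboard URLs.
const WORK_TABS = [
  "https://github.com/your-org/your-repo/pulls",
  "https://console.cloud.example.com/logs",
  "https://your-org.example.my.salesforce.com/",
];

async function workMacro(minutes: number) {
  const browser = await chromium.launch({ headless: false });
  const context = await browser.newContext();
  const pages = await Promise.all(WORK_TABS.map(() => context.newPage()));
  await Promise.all(pages.map((p, i) => p.goto(WORK_TABS[i])));

  const end = Date.now() + minutes * 60_000;
  while (Date.now() < end) {
    for (const page of pages) {
      await page.bringToFront();        // simulate tab switching
      await page.mouse.wheel(0, 800);   // simulate scrolling
      await page.waitForTimeout(3_000); // dwell a bit, like a human reading
    }
  }
  await browser.close();
}

workMacro(20).catch(console.error);
```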