Loading certificates into the phone is as easy as opening the files. The problem here is that Android apps have to opt in to loading user-imported certificates.
Chrome and many other browsers will load these certificates just fine if you install them the official way. Apps that specify they trust the user store will also load them without any issues.
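For reference, the opt-in described here is done through Android's Network Security Configuration, an XML file referenced from the app's manifest via android:networkSecurityConfig. A minimal sketch of a config that trusts user-imported certificates in addition to the system store (the file path is the conventional one; the app name is up to the developer):

```xml
<!-- res/xml/network_security_config.xml -->
<network-security-config>
    <base-config>
        <trust-anchors>
            <!-- System CAs: trusted by default -->
            <certificates src="system" />
            <!-- User-imported CAs: NOT trusted by default since Android 7 (N);
                 this line is the explicit opt-in -->
            <certificates src="user" />
        </trust-anchors>
    </base-config>
</network-security-config>
```

Almost no production apps ship this opt-in, which is exactly why traffic inspection breaks for everything except browsers.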
The method that's now broken fails because the author is using a workaround: with root permissions, the system store can be altered, and apps do trust that store by default. Chrome, however, is following best practices and enforces that certificates are logged in Certificate Transparency (CT) logs. This isn't done for user-imported certificates for obvious reasons, but it is applied to system certificates to prevent rogue CAs from issuing fraudulent certificates without exposing themselves to the world.
This means the workaround no longer works, or at least not as easily. There are still workarounds to fix the workaround, like the flags the author suggests here. It was never a supported way of doing things and unsupported workarounds are bound to break at some point.
I don't know how iOS deals with certificates; I suspect it's something sensible when the normal API is used (opt-out of user certificates, that is). However, apps like social media clients and messengers will often include certificate pinning that is impossible to get around without a jailbreak plus runtime code modification through tools like Frida. They include the hash of their (intermediate) certificates in the application itself and validate that the chain is signed by a certificate with that specific hash. That way, a malicious certificate authority can issue a "valid" certificate that's still useless for MitM-ing the app's users!
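To make the pinning mechanism concrete: pins are commonly the base64-encoded SHA-256 of a certificate's SubjectPublicKeyInfo (SPKI), and the app accepts a chain only if some certificate in it hashes to a shipped pin. A rough sketch of just the hashing and matching side (the byte strings below are hypothetical stand-ins; real code would extract the SPKI DER from the actual certificates):

```python
import base64
import hashlib

def spki_pin(spki_der: bytes) -> str:
    """Compute an HPKP/OkHttp-style pin: base64(sha256(SPKI DER bytes))."""
    digest = hashlib.sha256(spki_der).digest()
    return base64.b64encode(digest).decode("ascii")

def chain_matches_pin(chain_spkis: list[bytes], pinned: set[str]) -> bool:
    """A chain passes if any certificate in it hashes to a pinned value."""
    return any(spki_pin(spki) in pinned for spki in chain_spkis)

# Hypothetical stand-in bytes; a real check would parse the server's chain.
leaf = b"leaf-spki-der"
intermediate = b"intermediate-spki-der"
pins = {spki_pin(intermediate)}  # the app ships this hash in its binary

print(chain_matches_pin([leaf, intermediate], pins))  # pinned intermediate present: passes
print(chain_matches_pin([leaf], pins))                # MitM chain without it: fails
```

Note the pin targets the key, not the whole certificate, so the app keeps working when the certificate is renewed with the same key, and a MitM CA's certificate fails no matter how "valid" it is.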
Frankly, it doesn't matter why exactly they've broken it. What matters is that it is broken, with no easy way to intercept traffic for all apps.
This is an extremely user-hostile position from Android and Google, clearly meant to take oversight of what apps send out of the hands of the device's owner. I have no doubt they'll continue this cat-and-mouse game of trying to make it impossible to see the traffic generated by your own device.
Now this is something the EU should work on changing instead of trying to dismantle E2E encryption.
This thread is simply about a user-visible warning screen in Chrome. It has nothing to do with apps, etc. And the warning looks like it's skippable, since it has an "Advanced…" option. Not sure how this is supposed to impact dev workflows or be user-hostile in any way.
It does have to do with the overall topic of traffic inspectability.
As explained above, a relatively recent change in Android makes applications distrust user-store certificates by default, unless an application explicitly opts in. ~None of them do, except Chrome.
The solution to that problem was to install the certificate into the system store. But now Chrome considers all system store certificates to be public ones and requires CT for them.
So now there's no way to install a certificate to be able to inspect traffic from both Chrome and other applications at the same time. (If a certificate is in both the system store and in the user store, the system store version takes precedence, so Chrome would still require CT.)
There's a Chromium bug the author of the article filed to document this regression and you can already see a Chromium dev argue that "reverse engineering" (i.e. the ability to inspect the traffic your own device produces) is "understandably" not an addressed scenario: https://bugs.chromium.org/p/chromium/issues/detail?id=132430...
To be clear, this particular change isn't the end of the world, but none of them are since they're just using the slow frog-boiling method. Each change makes it a little bit harder until eventually it won't be possible at all.
> The problem here is that Android apps have to opt in to loading user-imported certificates.
Yes, but since Android N introduced this change, I haven’t met a single non-browser app that opted in to trusting the user store, or offered an option to do that. Maybe some enterprise apps do that? So it’s practically broken for any non-browser app; as for browsers I’ll just use a desktop one...
Ah, thanks for the detailed explanation. Have been wondering this for a while.
So if I read it correctly, the major difference between the two platforms is the opt-in/opt-out part.
On iOS I can sniff some random small apps trivially, since most of them don't enable pinning; on Android, on the other hand, the restriction is on by default, so I have to manually patch the APKs every time.
IIRC "have to opt in to loading user-imported certificates" wasn't the case a few generations of Android ago, correct?