It's a password manager with cryptographic vendor lock-in.
There are definitely some benefits though, such as immunity from phishing. Surely we as an industry can bring them about in a way that doesn't involve cryptographic vendor lock-in.
There is no password, so it can't be a password manager. Without a password it avoids all the downsides of passwords: having to store them securely on both ends, rainbow tables, credential reuse, weak password choice, and having to remember them. It's a cryptographic keypair manager. Key management is always the barrier to really good real-world cryptography, so I'm heartily in favour. Anything that makes it possible for regular people to use strong cryptography is a huge win.
Since it's all just FIDO2/WebAuthn under the hood, it's hardly lock-in. It's a bit of Apple UI tinsel to make life simple, plus their excellent iCloud Keychain sync.
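To make that concrete, here's roughly what a site does to register a passkey via the standard WebAuthn browser API; nothing here is Apple-specific. (Just an illustrative sketch: the rp/user values are made-up placeholders, and in a real flow the challenge comes from the server rather than being generated client-side.)

```typescript
// Minimal WebAuthn registration sketch (browser side).
const credential = await navigator.credentials.create({
  publicKey: {
    // Placeholder challenge; a real relying party issues this server-side and
    // verifies it comes back in the signed clientDataJSON.
    challenge: crypto.getRandomValues(new Uint8Array(32)),
    rp: { id: "example.com", name: "Example" },           // the keypair is scoped to this domain
    user: {
      id: new TextEncoder().encode("user-123"),           // hypothetical opaque user handle
      name: "alice@example.com",
      displayName: "Alice",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }],  // ES256
    authenticatorSelection: { residentKey: "required", userVerification: "required" },
  },
});
// The site only ever receives the public key and signed assertions. The private
// key stays in the authenticator (here, the platform authenticator backed by
// iCloud Keychain), and the browser refuses to exercise the credential on any
// origin that doesn't match the registered rp.id, which is where the phishing
// immunity comes from.
console.log(credential?.id); // base64url credential ID the server stores
```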
The industry doesn't seem to have a working software solution for mobile-phone authentication secrets that is both 1) immune to a user being persuaded to export their data (and thereby phished), and 2) able to let the user export their data at any time (to prevent lock-in).
What would it look like to do #2 safely, without giving up #1 and enabling the phishing we see today?
I get where you're coming from and you're not wrong, but at the same time, I don't buy this as an excuse for vendor lock-in here, because it seems like Apple is already backing up passkeys to iCloud.
If Apple has decided that the risk of getting your passkeys phished out of your Apple iCloud Account is outweighed by the benefit of users being able to restore/sync login details immediately when they buy a new iOS device and log into it, then I think it's reasonable for users to expect the same treatment and the same experience when they're moving away from iOS.
If Apple weren't backing up any of the logins, and had committed to making you manually re-create every key one by one from your recovery options whenever you trade in your phone and upgrade to the latest iPhone, then I'd accept not having an export option for Android/Linux/Windows. Otherwise, it will just seem suspiciously convenient to me if they ultimately decide that exporting keys is an acceptable risk unless it's to a competitor's device.
As far as I can tell, there hasn't been any official confirmation that users won't be able to export them to non-iOS devices, so maybe it's all worry over nothing. But I don't think security is a justification for applying restrictions only to devices outside of Apple's ecosystem.
I don't consider this solution an excuse for vendor lock-in. I consider this a problem that has no known solutions without vendor lock-in.
If you offer users a way to export, then you offer phishers a way to social engineer users. So either you prevent social engineering (lock-in: yes), or you allow exports (lock-in: no).
Which choice takes precedence when serving the market of "non-technical mobile phone users"?
I think you're misunderstanding what I'm saying. Apple IS allowing users a way to export right now, we just don't know whether non-iOS devices will be supported.
I consider the binary you describe to be a justifiable reason for Apple to offer no way to export from a phone, but that's not what they're doing. And I do not consider it a justifiable reason for Apple to allow only exporting between iPhones.
Apple is syncing passkeys to iCloud, presumably so they can be synced between devices and restored if a device is lost/destroyed. That's an export option, and iCloud syncing/restoration between phones is vulnerable to phishing attacks, but Apple has decided that the user experience without iCloud backup would be so bad that they're accepting the extra risk of users having their iCloud accounts phished and their keys synced to an attacker's phone.
> Which choice takes precedence when serving the market of "non-technical mobile phone users"?
In Apple's case, they have decided that allowing users to recover accounts easily is more important for non-technical users than protecting them from export phishing attacks. They've said very explicitly here that they think allowing export is more important than preventing phishing.
We can debate whether Apple made a good choice with that, but having made that choice, there is now no reason for them to say that Android transfers would be a unique security threat.
Your choice is to allow people to be phished for credentials, then.
Gullible people will cheerfully complete any attacker-described PC syncing process, ignoring every security warning presented to them, in order to give away the keys to their accounts. They'll use a friend's PC, or a library PC, or anything under the sun, if the phisher promises to give them something for nothing.
Apple is already remotely backing up passkeys off-device.
We are having a debate about an Apple policy that doesn't exist. Apple is not following the "keys never leave your device" model, so that security model has nothing to do with whether or not Apple will engage in vendor lock-in.
We're not making the choice to leave users vulnerable to phishing attacks, Apple made that choice, and we're arguing that because they made that choice they have no excuse to also engage in vendor lock-in.
As far as I know, Apple requires iCloud password and PIN entry on an Apple hardware device being paired to iCloud to access Keychain data, and tends to block Apple devices by hardware ID when they're associated with bulk login attacks. That makes the attack exorbitantly expensive for phishers, since they'd need a shipping container full of iPhones to even begin harvesting credentials, assuming they could convince users to turn over their iCloud passwords (which half of my friends don't even know).
This is how vendor lock-in allows protections against phishing that a naive data export would bypass. No one has yet suggested how this level of protection can be offered to end users without lock-in, across many such posts and threads, for many years now. I remain hopeful that there’s another way, but I’m not going to demand Apple do insecure exports at the expense of users in the meantime.
> As far as I know, Apple requires iCloud password and PIN entry on an Apple hardware device being paired to iCloud to access Keychain data, and tends to block Apple devices by hardware ID when they're associated with bulk login attacks. That makes the attack exorbitantly expensive for phishers, since they'd need a shipping container full of iPhones to even begin harvesting credentials, assuming they could convince users to turn over their iCloud passwords (which half of my friends don't even know).
It's not my intention to necro an old thread, but I've been away for a while and haven't seen this. For the record, I have never seen a convincing argument for why these standards couldn't be applied to other platforms, particularly now that hardware attestation is a thing. It is very convenient to Apple that the line between what hardware they trust and what hardware they don't begins and ends with their own devices, even though there are plenty of devices on the market that could be verified using similar hardware checks.
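For what it's worth, the attestation hook is already vendor-neutral at the API level: a relying party can ask any authenticator to include an attestation statement (whether a given authenticator actually provides one is up to its vendor) and then decide for itself which hardware roots it trusts. A rough sketch, with the same caveats as any toy example (placeholder rp/user values, server-issued challenge omitted):

```typescript
// Registration sketch that additionally requests a hardware attestation statement.
const cred = (await navigator.credentials.create({
  publicKey: {
    challenge: crypto.getRandomValues(new Uint8Array(32)), // placeholder; normally server-issued
    rp: { id: "example.com", name: "Example" },
    user: { id: new TextEncoder().encode("user-123"), name: "alice", displayName: "Alice" },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }],   // ES256
    attestation: "direct", // ask the authenticator to prove what hardware it is
  },
})) as PublicKeyCredential;

// attestationObject is a CBOR blob; the server decodes it and validates the
// certificate chain against whichever hardware roots it chooses to trust
// (Apple devices, Android devices, security keys, and so on). Nothing in the
// format privileges one vendor's hardware over another's.
const attestation = (cred.response as AuthenticatorAttestationResponse).attestationObject;
console.log(`attestation statement to verify server-side: ${attestation.byteLength} bytes`);
```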
Additionally, I don't really see how access to iPhone hardware is a deterrent against phishing. It makes it a little harder, maybe, but it doesn't eliminate the problem. Nothing I can see in the scheme Apple has published says it won't allow backups to be restored to used iPhones. Maybe I've missed something in the docs I've read, but I don't see why you would need multiple iPhones for this at all: reuse the same one multiple times.
A password and PIN are not a defense against phishing, and claiming that criminals won't have access to a mass-market consumer device seems really naive to me. People do get phished out of their iCloud accounts; they're not magic. Getting users to turn over their iCloud passwords is how existing iCloud phishing attacks happen today.
What we see with the above scheme is Apple deciding that completely eliminating phishing attacks isn't as important as allowing iPhone users to back up their keys. Where they arbitrarily draw the line on how much phishing risk they're willing to tolerate, and whether that line seems specifically designed to create the most vendor lock-in possible, is something I think is worth criticizing. And characterizing the place where they've drawn the line as a fact of nature, rather than a conscious decision to decrease security and allow phishing attacks but only when it benefits Apple to do so, strikes me as an excuse.
The reality is that Apple's current implementation is vulnerable to phishing, and Apple has decided that leaving that vulnerability open is worthwhile for users. If I have access to your iCloud credentials (which are vulnerable to phishing attacks) I can restore your login keys to an iPhone I'm holding and then use those keys to access your other accounts.
> No one has yet suggested how this level of protection can be offered to end users without lock-in
To be clear, I'm not certain I would have any objection to Apple offering a secure level of protection that guarded against phishing, even if it resulted in some lock-in. But they don't. Hardware restrictions for mass-market devices are not a defense against phishing.
I am not demanding that Apple make its products less secure, I am demanding that Apple not pretend that security is the reason it's restricting its devices at the same time that Apple exposes its users to the same phishing risks within its ecosystem.
There is no reason why Apple could not (using the same access system they've already decided is good enough for iPhone restoration) also allow users who move ecosystems to access iCloud the same way and restore to certified Android devices. There wouldn't be any loss of security there beyond what Apple has already decided it's comfortable with.