Hacker News

Passwords in iCloud Keychain are already E2EE; it seems reasonable that the private passkeys would be too.



> iCloud ... backup

> E2EE

If you can lose all your existing devices, and can still restore your data, then that data isn't end to end encrypted.

I'm taking the "ends" in E2EE to mean your devices: nothing but your devices can decrypt your E2EE-protected data. If a new device can enter the circle of trust without an existing device's cooperation, then there is a backdoor.

I imagine iCloud Keychain supports synchronization rather than backup.


The passwords stored in your backup via iCloud Keychain use the passcode of one of your devices as a secondary encryption layer, which doesn’t have a recovery mechanism the way the Apple ID used to secure your iCloud backup does. Not sure that meets the definition of E2EE, but it’s not as if the passwords are recoverable by another party (or even by you, if you forget the passcode) just because they’re in your iCloud backup.


So maybe I don't get it, but I always understood that 2FA means something you know plus something physical you have. Now, if I can get the keychain using only something I know, doesn't that somewhat defeat the purpose of 2FA?


In general it's "who you are" (biometrics) as well as "what you have", with the OS ensuring that the phone itself was unlocked and adding an extra biometric check when signing in with passkeys. This is how iOS currently works: it prompts for Face ID before it signs any WebAuthn challenges.

Also, ideally, your syncing passkey solution (whether that's 1Password or iCloud Keychain) would itself require a combination of multiple factors before you can get in. In the case of iCloud Keychain, 2FA is on by default on your Apple account, and the keychain is also protected by your password plus the passcode of one of your devices. In general this is already immensely more secure than passwords, because the website verifies a signature instead of the correctness of a shared secret. It would still be possible to have 2FA with the first factor being a passkey and the second being another physical security key, or perhaps verification of an email code, but that would likely be reserved for enterprises and high-security applications.

(I assume Apple themselves aren't going passwordless anytime soon, especially given how that would have to work on fresh devices.)


Typically MFA is something you have (physical possession), along with something you know (secret) or something you are (biometric).

This is more abstract than physical possession of a single device with a non-exfiltratable private key. There are synchronization processes, so it's one of many physical devices on a sync fabric that allows devices to be added.

The process for adding a device should require multiple factors as well, but I believe there is ultimately a recovery mechanism, typically something like a printed recovery key, which would arguably make this single-factor.

However, most deployed 2FA today is via SMS, email, or backed-up TOTP. The goal is to build a much more secure system that is still recoverable enough to get consumer adoption, not to achieve, say, NIST 800-63 AAL3.
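For contrast: backed-up TOTP is just an HMAC over a time counter keyed with a shared secret, so any copy of that secret — including one sitting in a backup — can mint valid codes. A minimal RFC 4226/6238-style sketch:

```go
package main

import (
	"crypto/hmac"
	"crypto/sha1"
	"encoding/binary"
	"fmt"
	"time"
)

// hotp computes an RFC 4226 6-digit code from a shared secret and counter.
func hotp(secret []byte, counter uint64) int {
	var msg [8]byte
	binary.BigEndian.PutUint64(msg[:], counter)
	mac := hmac.New(sha1.New, secret)
	mac.Write(msg[:])
	sum := mac.Sum(nil)
	// Dynamic truncation: take 31 bits starting at the offset in the last nibble.
	offset := sum[len(sum)-1] & 0x0f
	code := binary.BigEndian.Uint32(sum[offset:offset+4]) & 0x7fffffff
	return int(code % 1_000_000)
}

// totp is hotp evaluated over the current 30-second time step (RFC 6238).
func totp(secret []byte, t time.Time) int {
	return hotp(secret, uint64(t.Unix())/30)
}

func main() {
	secret := []byte("12345678901234567890") // RFC 4226 test secret
	// Every holder of the secret — server, phone, backup — derives identical codes:
	fmt.Println(hotp(secret, 0)) // 755224 (RFC 4226 test vector)
	fmt.Println(totp(secret, time.Now()))
}
```

Nothing here involves a private key or the device it lives on, which is why synced TOTP is a weaker baseline than even a synced passkey.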

One ongoing proposal is that you also get an additional device-bound factor. Seeing a new device-bound factor would let a site decide to perform additional user-verification checks if desired.


Maybe use of the user's account password would allow for E2EE recovery without any device?


How could one verify that? Like for a compliance audit?


https://support.apple.com/guide/sccc/introduction-sccccea618...

Introduction to Apple security assurance

As part of our commitment to security, Apple regularly engages with third-party organizations to certify and attest to the security of Apple’s hardware, software, and services. These internationally recognized organizations provide Apple with certifications that align with each major operating system release. …


Are such third parties listed? Can you inspect their reports? What testing methodologies are involved in issuing such certifications? And can we see the certifications at all?


If you don't trust Apple, why would you trust a third party auditor?

I can't think of any entity I would trust with securing truly sensitive information. For important stuff, do it yourself. For simple things, including bank accounts and such, I see no issue with trusting Apple.


Because you’re trusting both Apple and the third party jointly, each of whom has different incentives.

I don’t know that I buy the “for truly sensitive stuff, do it yourself” line. That’s like saying to handle the truly lethal substances yourself. Most people aren’t more skilled than Apple’s security folks; you’re almost certainly going to screw up your encryption or leave some vulnerability unpatched or unknown. Frankly, I consider my iOS devices to be some of the most secure systems I have access to, and reading through their security documentation has informed that opinion.


> Because you’re trusting both Apple and the third party jointly, each of whom has different incentives.

The cynical view, of course, is that Apple's incentive and the Third Party's incentive can become very much aligned for the right amount of money.


You also have to consider the market value of their reputations jointly as well. It would have to be a huge incentive to risk their reputations: both Apple’s, with its security-conscious customers and customers under heavy regulatory burdens, and the auditor’s, whose only asset of value is its reputation. Auditors have been known to poof out of existence (Arthur Andersen, anyone?)


Trust requires transparency, and a published security audit report produced by a reputable independent auditor would definitely increase my trust in Apple, because it would show they don't have anything to hide.


Yes, particularly if you have a need to. But a lot of the details you mention can be found in that link.



