The problem with any remote arrangement is that you have to trust Apple that the server side is actually running the software they claim. Their answer to that is "you can audit us", but I don't see how that would prevent them from switching things in between audits.
As far as local processing goes, though, you're still fundamentally trusting Apple that the OS binaries you get from them do what they say they do. Since they hold all the signing keys, they could easily push an iOS update that extracts all the local data and uploads it to some server somewhere.
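To make that trust relationship concrete, here's a minimal sketch of how signed-update verification works in general. This is illustrative, not Apple's actual update pipeline; `UpdateImage` and `deviceAcceptsUpdate` are names I made up:

```swift
import CryptoKit
import Foundation

// Illustrative sketch, not Apple's actual update pipeline: the device pins the
// vendor's public key and accepts any image carrying a valid signature from it.
// Whoever holds the matching private key can therefore ship arbitrary code.

struct UpdateImage {
    let payload: Data    // the OS image itself
    let signature: Data  // vendor signature over the payload
}

func deviceAcceptsUpdate(_ update: UpdateImage,
                         pinnedVendorKey: Curve25519.Signing.PublicKey) -> Bool {
    // The device can't judge what the payload *does*; it only checks who signed
    // it. A data-exfiltrating build signed with the same key passes just as well.
    pinnedVendorKey.isValidSignature(update.signature, for: update.payload)
}
```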
Now, I don't think that either of these scenarios is likely to happen if it's down to Apple by itself - they don't really gain anything from doing so. But they could be compelled by a government large and important enough that they can't just pull out of its market. For example, if the US demanded such a thing (as it already has in the past) and the executive made a concerted push to force it.
> When a user’s device sends an inference request to Private Cloud Compute, the request is sent end-to-end encrypted to the specific PCC nodes needed for the request. The PCC nodes share a public key and an attestation — cryptographic proof of key ownership and measurements of the software running on the PCC node — with the user’s device, and the user’s device compares these measurements against a public, append-only ledger of PCC software releases.
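That client-side flow can be sketched roughly like this. A hedged, minimal illustration: `AttestationBundle`, `TransparencyLog`, and `verifyNode` are stand-in names I invented, not Apple's real API, but the checks mirror the two steps the quote describes (proof of key ownership, then measurement lookup in the append-only log):

```swift
import CryptoKit
import Foundation

// Hypothetical types sketching the client-side check described above; none of
// these names are Apple's actual API.

struct AttestationBundle {
    let nodePublicKey: Curve25519.Signing.PublicKey  // key the request will be encrypted to
    let softwareMeasurement: SHA256.Digest           // hash of the software the node claims to run
    let signature: Data                              // node's signature over the measurement (proof of key ownership)
}

struct TransparencyLog {
    // Stand-in for the public, append-only ledger of PCC software releases.
    let publishedMeasurements: Set<SHA256.Digest>
}

enum AttestationError: Error {
    case badSignature        // node failed to prove it owns the advertised key
    case unknownMeasurement  // software was never published to the ledger
}

/// The client only sends the inference request if the node proves key
/// ownership AND its measurement appears in the public log.
func verifyNode(_ bundle: AttestationBundle, against log: TransparencyLog) throws {
    let measured = Data(bundle.softwareMeasurement)
    guard bundle.nodePublicKey.isValidSignature(bundle.signature, for: measured) else {
        throw AttestationError.badSignature
    }
    guard log.publishedMeasurements.contains(bundle.softwareMeasurement) else {
        throw AttestationError.unknownMeasurement
    }
    // Only now would the client encrypt the request to nodePublicKey.
}
```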
> compelled by a government
Sadly, the bar is much lower than "compel". Devices are routinely compromised via zero-day vulnerabilities sold by exploit brokers to multiple parties on the open market, including governments; that goes especially for any device with cellular, Wi-Fi, or Bluetooth radios. Hopefully the Apple C1 modem starts a new trend in radio baseband hardening, including PAC, ASLR and iBoot, https://www.reuters.com/technology/apple-reveals-first-custo...
> Their answer to that is "you can audit us", but I don't see how that would prevent them from switching things in between audits.
PCC does actually prevent Apple from switching things in between audits, to a high degree. It’s not like a food safety inspection: the auditors sign the hardware in a multi-party key ceremony, and there are other countermeasures like chassis tamper switches. PCC clients use a protocol that refuses to connect to anything that can’t present a valid signature. This is detailed in Apple’s documentation.[1]
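To illustrate why the multi-party angle matters, here's a hedged sketch of threshold endorsement. The names, the k-of-n structure, and the parameters are my assumptions, not Apple's actual ceremony, but it shows the basic property: no single key holder can unilaterally swap the software between audits.

```swift
import CryptoKit
import Foundation

// Rough illustration of threshold endorsement; names and parameters are
// assumptions, not Apple's actual scheme. A release measurement is trusted
// only if enough independent parties signed it.

struct ReleaseRecord {
    let measurement: Data     // hash of the software release
    let endorsements: [Data]  // one signature per endorsing party
}

func isTrusted(_ release: ReleaseRecord,
               parties: [Curve25519.Signing.PublicKey],
               threshold: Int) -> Bool {
    // Count the enrolled parties that produced a valid signature over the measurement.
    let validEndorsers = parties.filter { key in
        release.endorsements.contains { sig in
            key.isValidSignature(sig, for: release.measurement)
        }
    }
    return validEndorsers.count >= threshold
}
```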
See, this is why I think privacy engineering is low-key the most cutting-edge aspect of server development. Previously held axioms are made obsolete by architectural advancements. I think we’re looking at a once-in-15-years leap - the previous ones being microservices and web-based architecture.
Isn't local processing on Apple devices rooted in the same Secure Enclave hardware/firmware that has been attacked and hardened for 10+ years?