I agree that those statements are correct; however, my reading of the proposed Apple implementation was that it struck a good balance between maximising the ability to discover CSAM, minimising threat vectors, minimising false positives, and minimising the possibility that a malicious government could force Apple to implement bulk surveillance.
I'm all for privacy, but those who put it above all else are already likely not using Apple devices because of the lack of control. I feel like for Apple's target market the implementation was reasonable.
I think Apple backed down on it because a vocal minority of privacy zealots (for want of a better term) decided it wasn't the right set of trade-offs for them. Given Apple's aim to be a leader in privacy, they had to appease this group. I think that community provides a lot of value and oversight, and I broadly agree with their views, but in this case it feels like we lost a big win in the fight against CSAM in order to gain minor, theoretical benefits for user privacy.
But "the ability to discover CSAM" is by itself an excuse for mass surveillance, not a bona fide goal.
It is certainly possible, instead, to investigate first, identify likely pedophiles, and then get a search warrant.
Discovering users sharing CSAM is a goal, isn't it? That's why governments around the world require cloud storage providers to scan for it: waiting until the police receive a report about someone isn't really feasible. A proactive approach is necessary, and mandated in many countries.
IMO, diminishing people's privacy is a goal. And Apple's CSAM detection could be tricked in different ways, especially with generative algorithms. For example, a malicious person could send you an album of 100+ photos that look normal to the eye but have been altered to trigger CSAM matches, and now reviewers have to check 100+ photos per person per send and dismiss the false positives (there's a toy sketch of why such collisions are possible at the end of this comment). Since this can be replicated, imagine having to review 100k such cases generated by just 1k people. That's insane: either the flagged photos don't get checked and the system becomes useless (ill-intentioned people can just send 5k-photo albums where everything triggers a match and only a handful are real CSAM; multiply that by the number of attackers and the system is easy to game), or thousands of hours get spent checking all these photos and investigating each person.

Another attack vector is generating legitimate-looking CSAM, because generative algorithms are too good now; but in that case (AFAIK) it isn't a crime, since the image is fully generated (either using a person's face as a starting point, or using a description of their face tweaked enough to look realistic).
So what we get is:
- a system that can be gamed in different ways
- a system whose effectiveness wasn't demonstrated before release
- a system that may drive those people to other platforms with E2EE and no CSAM scanning (I assume that anyone who knows what E2EE is can find such a platform), which again makes it obsolete
AND:
- a system that can't be verified by users (is the CSAM hash list legitimate? can it trigger on other things? is the implementation safe?)
- a system that a government can alter by changing the CSAM hash list to target specific people (say, Snowden or some journalist who found something sketchy)
- a system that Apple or another company could alter, changing the hash list for ad-targeting purposes
I don't know, maybe I'm overreacting, but I've seen what a repressive government can do, and with an instrument like this it's frightening what surveillance vectors could be opened.
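To make the collision attack above concrete, here's a toy Python sketch. It is my own illustration, not NeuralHash or anything Apple actually shipped: it assumes a simple block-average hash and a Hamming-distance threshold, purely for demonstration, and shows that an attacker only has to nudge an image's coarse structure toward a blocklisted image's to force a "match".

    # Toy sketch (my own assumptions, not NeuralHash or Apple's actual system):
    # an "average hash" fingerprints an image by its coarse 8x8 brightness pattern,
    # and two images "match" when their fingerprints differ in only a few bits.
    import numpy as np

    def average_hash(img):
        # img: 64x64 grayscale array -> 64-bit fingerprint of its 8x8 block means
        blocks = img.reshape(8, 8, 8, 8).mean(axis=(1, 3))
        return (blocks > blocks.mean()).flatten()

    def matches(a, b, max_hamming=5):
        # fuzzy match: tolerate a few differing bits, like a perceptual-hash threshold
        return int(np.count_nonzero(a != b)) <= max_hamming

    rng = np.random.default_rng(0)
    target = rng.random((64, 64))   # stands in for an image on the blocklist
    benign = rng.random((64, 64))   # an unrelated, normal-looking photo

    # The attacker shifts each 8x8 block of the benign photo so its coarse brightness
    # pattern copies the target's; fine detail within each block is left untouched.
    shift = target.reshape(8, 8, 8, 8).mean(axis=(1, 3)) - benign.reshape(8, 8, 8, 8).mean(axis=(1, 3))
    attacked = benign + np.kron(shift, np.ones((8, 8)))

    print(matches(average_hash(target), average_hash(benign)))    # False: no accidental match
    print(matches(average_hash(target), average_hash(attacked)))  # True: forced collision

A real perceptual hash is much harder to fool than this toy, but the shape of the attack (perturb a benign-looking image until its fingerprint lands within the match threshold) is the same, which is what enables the flooding scenario above.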