Exactly: only "private" cloud data will be scanned instead, which is industry-standard practice for any self-respecting cloud provider anyway. It's a wonder Apple wasn't doing it already.
In any case, this will be automated, rather than some poor Tier-1 poring over iCloud Photos.
So only the guilty (and the false positives) would worry.
> So only the guilty (and the false positives) would worry.
If you truly want to "protect the children," you should have no issue with the police visiting and inspecting your house, and all of your neighbors' houses. Every few days. Unannounced, of course. And if you were to resist, you MUST be a pedophile who is actively abusing children in your basement.
Innocent until proven guilty implies no false positives. What happens if I get arrested because of a false positive? What happens to my life because there will always be that doubt from everyone?
My social life is crippled for the rest of my life because of a false positive. Which can happen to anyone. Which means everyone should worry.
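To make the worry concrete, here is some back-of-the-envelope base-rate arithmetic. Every number below is an illustrative assumption, not a published Apple or NCMEC figure: even a matcher with a one-in-a-million per-photo false-match rate flags a steady stream of innocent photos at iCloud scale.

```python
# Base-rate sketch: all figures are assumptions for illustration only.
photos_scanned_per_day = 1_000_000_000  # assumed daily photo uploads across all users
false_positive_rate = 1e-6              # assumed per-photo false-match probability

# Expected innocent photos flagged per day at this scale.
innocent_flags_per_day = photos_scanned_per_day * false_positive_rate
print(innocent_flags_per_day)  # → 1000.0
```

A thousand spurious flags a day, under these assumed numbers, is why "it can happen to anyone" is not hyperbole: with a large enough denominator, rare events happen constantly.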
This is false: the scanning occurs on the phone, not in the cloud. Plus, as has already been discussed at length, the NCMEC database is loaded with false positives.
The "nothing to hide" argument tends to fall apart when the database being used against you is full of legal imagery (which often isn't borderline or pornographic at all -- some of the flagged images literally don't show people).
Slippery slope to non-CSAM material? That ship has sailed already. The databases are a mess; from day one, the system detects non-CSAM material.
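Part of why false positives are baked in: systems like this match perceptual hashes, which are deliberately fuzzy so that resaves and edits of a known image still match. Below is a toy average-hash ("aHash") sketch, not Apple's NeuralHash, but it exhibits the same structural trade-off: an edited copy of an image matches exactly, while any unrelated image merely needs to fall under a distance threshold to be flagged.

```python
# Toy perceptual hash (average hash / "aHash"). This is an illustrative
# stand-in, NOT Apple's NeuralHash, but it shares the relevant property:
# matching is by closeness, not exact equality, so collisions are possible.

def average_hash(pixels):
    """64-bit hash of an 8x8 grayscale grid: bit i is 1 iff pixel i > mean."""
    assert len(pixels) == 64
    mean = sum(pixels) / 64
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

original = [(i * 37) % 256 for i in range(64)]       # arbitrary "image"
brightened = [p + 40 for p in original]              # edited copy (no clipping, for simplicity)
unrelated = [(i * 11 + 90) % 256 for i in range(64)] # a different image

# Uniform brightening shifts every pixel AND the mean by 40, so every
# bit is unchanged: the edited copy collides with the original exactly.
print(hamming(average_hash(original), average_hash(brightened)))  # → 0

# An unrelated image gets some nonzero distance; any match threshold
# above that distance would flag it. That is the false-positive mechanism.
print(hamming(average_hash(original), average_hash(unrelated)))
```

The design choice to tolerate edits is exactly what makes the match fuzzy, which is why a database seeded with mislabeled or legal imagery cannot be "fixed" by the matcher itself.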