First of all, your analogy of "sending officials to your home without a warrant" shows that you're again completely misunderstanding the feature.
The local scanning would've been enabled only if you were going to upload the photos to iCloud anyway.
To complete your crappy analogy: you had already agreed to send a company an inventory of everything in your apartment, and now you're throwing a hissy fit because they sent a robot to visit your house with a list of known child pornography photos to check whether you have any?
There weren't any "hard decisions": it wasn't a "this has a naked person in it" scanner. It wasn't an "abuse checker"; it was specifically created to find known CP.
It checked your photos against NeuralHashes of KNOWN child pornography. So the only way your sauna photos would've been flagged is if they had been included in a set of CP photos distributed widely enough to end up in the CSAM database.
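To make that concrete, the matching boils down to a set lookup against known-bad hashes, not a content classifier. Here's a minimal Swift sketch of that idea; everything in it is hypothetical: neuralHash stands in for Apple's proprietary perceptual hash (faked with SHA-256 purely so the code runs), and the hash values are placeholders, not real database entries.

    import Foundation
    import CryptoKit

    // Stand-in for the proprietary NeuralHash. The real thing is a
    // perceptual hash that maps visually similar images to the same value;
    // SHA-256 does NOT have that property and is used here only so this
    // sketch is runnable.
    func neuralHash(of photo: Data) -> String {
        SHA256.hash(data: photo)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Hypothetical hashes of known CSAM, as supplied by child-safety orgs.
    let knownCSAMHashes: Set<String> = [
        "d2a84f4b8b650937ec8f73cd8be2c74a",  // placeholder value
        "5f4dcc3b5aa765d61d8327deb882cf99",  // placeholder value
    ]

    // A photo is flagged only on a match against the known-bad set; there
    // is no "does this look like a naked person?" judgment anywhere here.
    func shouldFlag(_ photo: Data) -> Bool {
        knownCSAMHashes.contains(neuralHash(of: photo))
    }

The point of the sketch: a photo that isn't byte-for-byte (or, with the real perceptual hash, visually) identical to an image already in the database can never match, no matter what it depicts.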
And nobody claimed the system would magically cure the world of grooming; where did you get that idea from? Although the current filters in iMessage might help if the abuser is stupid enough to use that for grooming.
Sorry for the hostile tone of my previous posts, and thank you for clearing up the actual proposed implementation details. It sounds like I came to pretty much the same conclusion as you did. I might have been quite unclear, but I was actually referring to what in my mind is the inevitable stage 2 of the implementation: if you give them your little finger here, there's no stopping them. When I read about this, I was feeling sad, frustrated and a bit angry too. I like my iPhone; I switched from Android because those phones aged too quickly. Given the nature of the Apple ecosystem, I took this as crossing a line, and I would have to seriously look for alternatives, since this scanning is not something I want to accept; I would vote with my money elsewhere.
Yes, I know they scan all photos already. But to scan MY photos and hold me accountable, that is something I am not okay with. Yes, it's only iCloud, for now, but I don't see the point of using some third-party cloud. I used to, but on an Apple device it would be suboptimal for my daily use.
My point was about the original case of protecting children. That is what the advocacy groups want. That is what I want. But I draw the line at what I feel is my personal space: you don't scan my photos or files, and doing so is not the answer. Since a few behemoth FAANG companies rule the ecosystems and impact our lives, the battle for privacy needs to be fought right there. So, for me, that would be the line, as I see the progression down that path as much worse. That is where I raise my hand and say: I'm out. What next? No smartphone? Maybe. :) Peace.