OK, and in theory, with new generative algorithms, do you think it's still OK? Suppose Apple implements this, and suppose someone finds a way to generate meme images that trigger Apple's algorithm even though a human can't see anything wrong with them. If that someone wants to harm you, sends you a bunch of those memes, and you save them, what happens? And what happens if somebody uses a generative algorithm to create CSAM-like images, using a real person's face as a base while the rest of the image is generated? Should that also trigger the CSAM scan?
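For context on why the collision worry is plausible: perceptual hashes deliberately map visually similar images to the same bits. Below is a minimal sketch using dHash, a classic difference hash; Apple's NeuralHash is a neural-network hash, not dHash, so this is a stand-in illustration of the shared property, not Apple's implementation. The 9x8 grids stand in for downscaled grayscale images.

```python
# Toy difference-hash (dHash) sketch in plain Python, no image library.
# Perceptual hashes are *designed* so near-identical pixels give
# identical bits, which is exactly what makes adversarial "meme"
# collisions conceivable.

def dhash(grid):
    """One bit per horizontally adjacent brightness comparison."""
    bits = 0
    for row in grid:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (left < right)
    return bits

def hamming(a, b):
    """Differing bits between two hashes; 0 means an exact match."""
    return bin(a ^ b).count("1")

# A deterministic 9x8 grid of brightness values (all below 200).
original = [[(r * 13 + c * 29) % 200 for c in range(9)] for r in range(8)]
# Perturb every pixel by a uniform +2 brightness shift, invisible to
# the eye: every left/right comparison, and so the hash, is unchanged.
tweaked = [[v + 2 for v in row] for row in original]

print(hamming(dhash(original), dhash(tweaked)))  # -> 0: identical hash
```

The same tolerance that lets the hash survive recompression or recolouring means an attacker only has to land *near* a target hash, not hit it exactly, which is why researchers produced NeuralHash collisions within weeks of the model being extracted.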
Also, you can't guarantee that Apple/Google will only scan for known instances of CSAM. What if a government orders them to scan for other types of content under the hood, like documents or who knows what else, because it wants to screw a particular person over? For the sake of example, suppose the target is a journalist who uncovered something shady and the government wants to put them in prison. You don't have access to either the matching algorithm or the CSAM hash list being used, so the system could be abused, and 'could be abused' usually means that sooner or later it will be.
These are reasonable criticisms of such a system in general, but Apple's design included specific mitigations for them: the hash list was to be the intersection of lists supplied by at least two child-safety organizations in different jurisdictions, the server was to learn nothing about any account until it crossed a threshold of roughly 30 matches, and every flagged account was to go through human review before any report was made.
I agree that the basic idea of on-device CSAM scanning has a lot of issues and should not be implemented. What I think was missing from the discourse was an actual look at what Apple was actually proposing, in terms of technical specifics, and why that design was well placed to avoid these particular problems.
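To make the threshold point concrete: the published protocol used threshold secret sharing, so the per-account key needed to decrypt the matched "safety vouchers" only becomes recoverable once enough matches exist. Here is a toy Shamir secret-sharing sketch of that threshold property; it is a simplified illustration under standard Shamir assumptions, not Apple's actual protocol or code.

```python
# Toy Shamir threshold secret sharing over a prime field.
# With threshold t, any t shares recover the secret; t-1 shares are
# information-theoretically useless -- the property Apple relied on so
# that isolated (e.g. adversarial) matches reveal nothing.
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is mod P

def make_shares(secret, threshold, count):
    """Split `secret` into `count` shares; any `threshold` recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def f(x):  # evaluate the random degree-(threshold-1) polynomial
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 to recover the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

secret = 123456789
shares = make_shares(secret, threshold=30, count=100)
print(recover(shares[:30]) == secret)  # True: 30 matches suffice
print(recover(shares[:29]) == secret)  # False (w.h.p.): 29 reveal nothing
```

So in the "someone sends you adversarial memes" scenario, a handful of false matches never becomes visible to anyone; an attacker would need to get ~30 colliding images saved to the victim's library, and even then the flagged images go to human reviewers, who would see innocuous memes and discard the report.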