The extreme hysteria around anything related to children often seems to serve as carte blanche to destroy privacy and implement backdoors in applications. Most child abuse comes from family members (which must be addressed at the source), and the most extreme cases make for awful law (doing away with E2EE or instituting mass surveillance to catch an incredibly small minority is absurd).
Much like other 'tough on crime' measures (of which destroying E2EE is one), the real problems need to be solved not at the point of consumption (drugs, guns, gangs, cartels) but at their root causes. Getting rid of E2EE just opens an avenue for government abuse of all of us while in no way guaranteeing that children will be meaningfully safer.
And no, we are not 'condoning' it when we declare E2EE an overall good thing. Real life is about tradeoffs, not absolutes, and the tradeoff here is protection from the government for potentially billions vs. maybe arresting a few thousand more real criminals. This is a standard utilitarian tradeoff that 'condones' nothing.
Don't worry, we fill our homes and pockets with enough cameras and microphones from private companies that the government can require the monitoring of everyone in every family 24/7 to make sure we're finally safe!
> The extreme hysteria created by anything related to children often seems to be carte blanche to destroy privacy and implement backdoors in applications.
Yes, and you can tell because the proposed solutions attack privacy when alternative solutions exist.
For example, simply deleting CSAM from devices locally, without involving any other parties, could have achieved the goals without privacy violations (a rough sketch of what that could look like follows this comment).
It makes me somewhat uncomfortable to argue for not involving other parties (like the police) in cases where real CSAM is found on someone's device. Like most people, I think that CSAM is morally reprehensible and deeply harmful to society. But just deleting it en masse would have been an effective and privacy-respecting solution.
I think it's important to see nuance even in things we don't like to think about. Not everything that comes with a price tag actually has to be paid for at that price: we were told we needed to give up privacy, but that wasn't necessary to take CSAM out of circulation.
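The "just delete it locally" idea above could, in principle, look something like the minimal sketch below: a hypothetical client-side matcher that compares files against a locally stored blocklist of known-image hashes and deletes matches without ever contacting a server. The blocklist, the directory layout, and the use of a plain SHA-256 digest (a real system would need a perceptual hash such as PhotoDNA) are all illustrative assumptions, not a description of any deployed product.

```python
# Hypothetical sketch: match files against a local blocklist and delete
# matches on-device. No network calls, no logging to a third party, no
# escalation to law enforcement; the whole process stays on the device.
import hashlib
from pathlib import Path

# Placeholder for a vendor-shipped set of known-image hash digests.
KNOWN_BAD_HASHES: set[str] = set()


def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def scan_and_delete(photo_dir: Path) -> int:
    """Delete any file whose digest is on the local blocklist; return count."""
    deleted = 0
    for path in photo_dir.rglob("*"):
        if path.is_file() and file_digest(path) in KNOWN_BAD_HASHES:
            path.unlink()  # local deletion only; nothing leaves the device
            deleted += 1
    return deleted
```

The point of the sketch is the data flow rather than the hashing details: detection and remediation both happen locally, so no third party ever learns what was (or was not) found.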
Ah... But you see, now you're arguing in "bad faith" because you're trying to protect the kiddie diddlers!
...Understand that I don't see it that way, and I applaud your way of thinking. I too have had to wrestle with the very uncomfortable "bedfellows", as it were, that consistent application of principles inevitably produces.
The fact remains, though, that a large swathe of the population considers the sacrifice of personal privacy a small price to pay to inflict harm on that subpopulation. My issue is that once you make the exception for one subpopulation, the slope is set.
Though even your "just delete it" has dystopian ramifications. Imagine the same mechanism applied to something like Tiananmen Square footage: recordings that are not blessed simply vanish. You're still leaving in somebody's hands what amounts to executive control over what information is allowed to exist, which is an unconscionably powerful lever to build.
This is one of those rare circumstances where "do nothing and clean up the mess" may be the wisest course of action.
> Most child abuse comes from family members (which must be solved at the source)
Yes. Since becoming an abuser is a process and not a moment, part of the solution must be making access to CSAM much harder.
> And no, we are not 'condoning' it when we declare E2EE an overall good thing.
Agreed. I'm sorry if I worded things in a way that caused you to see an implication which was not intended. To be clear: E2EE is a good thing. Championing E2EE is not equivalent to condoning CSAM.
What I did say is that in failing to try and provide any meaningful solutions to this unintended consequence of E2EE, the industry is effectively condoning the problem.
> This is a standard utilitarian tradeoff
If that's the best we can do, I'm very disappointed. That position says that to achieve privacy, I must tolerate CSAM. I want both privacy and for us not to tolerate CSAM. I don't know what the solution is, but that is what I wish the industry were aiming for. At the moment, the industry seems to be aiming for nothing but a shrug of the shoulders.
We have all this AI now, maybe give the people who want that stuff realistic enough victimless content and we can see if cases drop? It seems like we're approaching a time when the technology makes it possible to test the theory and get an answer anyway.
> That position says that to achieve privacy, I must tolerate CSAM. I want both privacy and for us not to tolerate CSAM.
Not true: you can have privacy and at the same time not tolerate child pornography; those are two perfectly compatible positions, and arguably the current state. What you cannot have, by definition, is privacy on the one hand and, on the other, no privacy so that someone can look for child pornography. You can still fight child pornography in any other way, but when it comes to privacy you have to make a choice: give people their privacy, or look through their stuff for illegal content. You cannot have both. If there is enough evidence, a judge might even grant law enforcement permission for privacy-violating measures; it just should not be the default position that your privacy gets violated.
...Now that is a well-conceived viewpoint, but I still argue, policy-wise, that no penalties can conscionably be assessed for failure to engage in said activity without spreading the taint of deputization, which de facto unmakes your stance. As a government deputy, you don't have that privacy. If you are not compelled to act as a deputy of the State, I can accept you escalating to NCMEC, but we also have to accept that there is no recourse against providers who turn a blind eye to the whole thing.
Failure to recognize this perpetuates the fundamental inconsistency.
> Yes. Since becoming an abuser is a process and not a moment, part of the solution must be making access to CSAM much harder.
This is a very big assumption. Sexual abuse of minors existed long before the internet, and long before photography. It is not at all clear that reduced availability of CSAM leads to less real-world abuse.
> If that's the best we can do, I'm very disappointed. That position says that to achieve privacy, I must tolerate CSAM. I want both privacy and for us not to tolerate CSAM. I don't know what the solution is, but that is what I wish the industry were aiming for. At the moment, the industry seems to be aiming for nothing but a shrug of the shoulders.
As other commenters have pointed out, the solution is to prevent children from being abused in the first place. Have robust systems in place to address abuse, and give kids effective education and somewhere to speak out if it happens to them or someone they know.
> Since becoming an abuser is a process and not a moment, part of the solution must be making access to CSAM much harder.
In my opinion CSAM is a symptom, not a cause.
It's difficult to "stumble across" that kind of material unless you're already actively looking for it, which means some amount of "damage" is already done.
I also highly doubt that someone with no proclivities in that direction would 'turn' as a result of stumbling across CSAM. I'd guess they'd go the other way and be increasingly horrified by it.
> It's difficult to "stumble across" that kind of material unless you're already actively looking for it
It's entirely likely that borderline cases (15-17 years old) are seen by millions of people without them realizing it. Pornhub is a popular "mainstream" porn website that has issues with CSAM and routinely removes it when found. It's entirely possible for "normal" consumers of pornographic material to stumble into CSAM in places like that unknowingly. When you're talking about obviously prepubescent children, I'm in full agreement that it's almost always restricted to folks who seek it out explicitly.
>>That position says that to achieve privacy, I must tolerate CSAM. I want both privacy and for us not to tolerate CSAM.
I want to have more money than Elon Musk... Sometimes life is not fair and we cannot always get what we want.
Any "backdoor" or "frontdoor" in encryption is a total failure of encryption. That is a immutable truth, more fixed in reality than the speed of light is in physics.
This is the part that really infuriates me, and I've seen it in several comments - this implication that the onus is on programmers or people who work in technology to somehow figure out a way to do this impossible thing, as if nobody has tried. And then they have the gall to say something like "I want both privacy and for us not to tolerate CSAM" without proposing any kind of theoretical solution.
If you can't even propose a hypothetical science fiction-esque way of achieving this, much less one that might actually be implementable now or in the near future (like, 5-10 years), you shouldn't get to have that opinion.
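To illustrate the point about backdoors above, here is a minimal sketch assuming a hypothetical key-escrow scheme in which every message is also encrypted under a single mandated "exceptional access" key. The Fernet primitive and the names are illustrative assumptions, not a model of any real proposal; the sketch only shows the structural problem: the escrow key is just another decryption key, so whoever holds or steals it can read everything.

```python
# Hypothetical key-escrow sketch: every message is encrypted twice, once for
# the recipient and once under a mandated escrow ("backdoor") key.
from cryptography.fernet import Fernet

recipient_key = Fernet.generate_key()  # held only by the intended recipient
escrow_key = Fernet.generate_key()     # the single mandated "lawful access" key


def send(plaintext: bytes) -> tuple[bytes, bytes]:
    """Return (ciphertext for the recipient, ciphertext for the escrow holder)."""
    return (Fernet(recipient_key).encrypt(plaintext),
            Fernet(escrow_key).encrypt(plaintext))


to_recipient, to_escrow = send(b"private message")

# The recipient can decrypt their copy, as intended...
assert Fernet(recipient_key).decrypt(to_recipient) == b"private message"

# ...but so can anyone who ever obtains the one escrow key (an insider, a
# hacker, a hostile government), for this message and every other message
# ever sent under the scheme.
assert Fernet(escrow_key).decrypt(to_escrow) == b"private message"
```

In this sketch the security of every conversation reduces to keeping one extra key secret forever, which is why a backdoor is better described as a built-in vulnerability than as a feature.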