So in your view it's perfectly OK to run an internet service where you don't check for CSAM? You're entitled to that view, I suppose, but it's a minority view. Allowing CSAM on your platform is illegal in most places, including the US, where there's a carve-out for exactly that in the law that otherwise protects Google, Facebook, et al. And in all of those jurisdictions, the public is not going to be able to see the CSAM or examine the substance of the allegations either before or during trial.
Every social network or file-sharing site that I've been aware of has a Trust and Safety department for just this reason, even X. The executives don't want to go to jail.
> So in your view it's perfectly ok to run an internet service where you don't check for CSAM?
Well, that's quite the assumption. The commenter you replied to said nothing of the sort. And yet that's your first conclusion? Is this how you operate in real life, at your job?
Telegram does moderate for CSAM. The claim that it does not is completely unsubstantiated. You can find CSAM across Meta's products. Does that mean they do not check for CSAM? No.
They refuse to take action even when confronted with it directly. That's why Durov is a disgusting human being.
"the app had gained a reputation for ignoring advocacy groups fighting child exploitation.
Three of those groups, the U.S.-based National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection and the U.K.-based Internet Watch Foundation, all told NBC News that their outreach to Telegram about child sexual abuse material, often shorthanded as CSAM, on the platform has largely been ignored."