You didn't provide a link to the SIO report, but I assume this is it: [1]. The report is mostly dedicated to teenagers trying to find ways to sell self-filmed content. You cherry-picked the claims against Telegram to make the allegations look more serious than they are, and didn't mention that the report makes more serious claims against Western platforms.
This is a quote from the beginning of the report:
> Large networks of accounts, putatively operated by minors, are openly advertising self-generated child sexual abuse material (SG-CSAM) for sale.
(By the way, this might be because it is very difficult to find a legitimate job if you are a teenager without any special skills or talents. Why doesn't the government do anything to change this? Where are teenagers from poor families supposed to get money?)
> Instagram is currently the most important platform for these networks, with features that help connect buyers and sellers
> Instagram’s recommendation algorithms are a key reason for the platform’s effectiveness in advertising SG-CSAM.
> Twitter had an apparent regression allowing CSAM to be posted to public profiles, despite hashes of these images being available to platforms and researchers.
Can we expect to see Musk and Zuckerberg in the same jail as Durov, then? Or does justice not apply to everyone equally?
Note also that the report gives the following recommendations in its conclusion:
> When an account is identified as selling SG-CSAM, disabling the account should be accompanied by messaging to the seller to attempt to discourage recidivism. This messaging might include:
> The fact that this content is widely illegal and can result in prosecution; being a minor does not prevent legal consequences
So basically, what the report suggests is not to help teenagers from poor families find a legitimate job, but to threaten them with a jail term for selling their own photos. So American!
> A June report from the Stanford Internet Observatory found that Telegram was the only major platform not to forbid illegal material in private channels and chats.
If you read the report, this means that Telegram's ToS do not explicitly forbid posting illegal material in private groups. But do you need to explicitly forbid what is already forbidden by law?
The report contains a further claim, though:
> It further states that “All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them”
This is alarming, but it is not exactly how things work, because you can actually report messages even in one-to-one private chats (for example, if you get spam from a new contact), and the sender can get blocked. I have never received illegal material from contacts (only spam), so I don't have experience reporting it.
> Telegram has also been observed by SIO as failing to perform even basic content enforcement on public channels, with instances of known CSAM being detected and reported by our ingest systems
If you read further, by "failing to perform basic content enforcement" they mean that Telegram doesn't check posted images against a database of known CSAM hashes, and they imply that Telegram is obliged to do this. However, I am not sure the law requires it.
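For readers unfamiliar with what such a check involves, here is a minimal sketch of hash-based matching in Python. The blocklist values here are made up for the demo, and real deployments use perceptual hashes (e.g. PhotoDNA or PDQ) that survive re-encoding and cropping, whereas a cryptographic hash like SHA-256 only catches byte-identical copies:

```python
import hashlib

# Hypothetical blocklist of hashes of known illegal images, as platforms
# might receive from a clearinghouse. These are demo values, not real ones;
# the first entry is simply the SHA-256 of the bytes b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(is_known_image(b"test"))   # True: its hash is in the demo blocklist
print(is_known_image(b"other"))  # False
```

The point of such a pipeline is that the platform never needs to store the illegal images themselves, only their hashes, and the per-upload check is a constant-time set lookup.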
Now I want to comment on some other vaguely worded claims.
> Telegram is a key component of the ecosystem of individuals trading and selling child sexual abuse materials,
What makes Telegram a "key component"? Did Durov design Telegram and add features with the primary intent of making it easier to sell illegal materials? That sounds implausible.
> At the heart of the case is the absence of moderation
Does he mean a lack of pre-moderation (reviewing every message before it is posted) or a lack of response to reports? There definitely is moderation on Telegram, so "absence of moderation" doesn't ring true to me. It would be good if they presented more details instead of vague words.
> absence of ... cooperation
"Cooperation" is a vague word. Maybe France just wants to be able to read all messages in private groups under the excuse of fighting crime? That would be a completely different story.
> No social network is perfect but talk to me when Twitter or Meta ignore CSAM agencies' repeated requests. I'll be waiting. Otherwise, Telegram is complicit.
> the app had gained a reputation for ignoring advocacy groups fighting child exploitation. Three of those groups, the U.S.-based National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection and the U.K.-based Internet Watch Foundation, all told NBC News that their outreach to Telegram about child sexual abuse material, often shorthanded as CSAM, on the platform has largely been ignored.
Ignoring reports of illegal material is one thing; ignoring invitations to join US-based programs, or to cooperate with them in ways Telegram is not required to by law, is a different thing. For some reason the article doesn't clearly state which of these it means; instead the author uses vague, ambiguous wording, like politicians do.
[1] https://stacks.stanford.edu/file/druid:jd797tp7663/20230606-...