
Your history of events seems... strange.

https://www.europol.europa.eu/newsroom/news/europol-and-tele...

It seems to me that Telegram is cooperating with police services to delist propaganda.

-------

If anything, you seem to be proving my point? Telegram is working closely with European officials to remove ISIS propaganda today.

> Over the past year and a half, Europol has been collaborating with Telegram in tackling terrorism online. Building upon a previous Referral Action Days with Telegram in October 2018, the event this year was an opportunity for both parties to review the kind of content that terrorist groups attempt to disseminate online and further improve the referral process with the common aim of ensuring that material glorifying terrorism would be removed from internet as soon as possible.

...

> Telegram is no place for violence, criminal activity and abusers. The company has put forth considerable effort to root out the abusers of the platform by both bolstering its technical capacity in countering malicious content and establishing close partnerships with international organisations such as Europol.

The above post is dated 2019, which means the two had been collaborating closely since roughly 2018. In 2015, Telegram immediately purged ISIS content in response to the attacks in Paris.

-------

I'm very confused by your post. Could you elaborate on your version of history? Or maybe what you're trying to say? As far as I can tell, your Telegram example only supports the Parler deplatforming. When extremist attacks occur, our only option is to silence the attackers.



In 2015, Telegram started removing posts after the attack in Paris, but attacks as early as 2014 had already been linked back to Telegram communications and recruiting. Even in 2015, its efforts were half-hearted until a high-profile exposé (Vox) and others like it ran.

There were never any calls to deplatform Telegram in this manner, as far as I know; it would be like giving Parler another 3-5 years to clean itself up.

I agree that the attackers need to be silenced, but this move by Apple and Google is unprecedented and unnecessary; if users are inciting violence, it should be dealt with legally, not via corporate ethics panels. Platforms should be responsible for content like what Parler presents, but Apple shouldn't be the one holding them accountable. This isn't the cake shop declining to serve a gay customer, but rather the phone company pulling the plug.


After the Paris attack in 2015, it seems like Telegram immediately started working with authorities.

In contrast, Parler doubled down on its behavior. It immediately became clear that Parler admins were NOT working on clamping down on the violent extremism currently occurring on its site.



