
> Parler has been, for all intents and purposes, killed.

Parler wasn't "killed" because planning took place there, but because they were openly unable, if not actually unwilling, to take action against a huge backlog of specific problems identified to them; the problem was current and forward-looking, not retrospective.




Amazon has stated that it warned Parler for months without redress, but to offer another perspective, the CEO of Parler stated in an interview that they were notified the day before they got the plug pulled, sought to work with AWS on solving the issue, then were "deplatformed" the following day.

He said, she said, but obviously both sides are incentivized to make themselves look clean.


Though I doubt that Parler has many employees, it seems unlikely that emails from your hosting company would be read by the CEO. It's entirely possible both people are telling the truth.


The CTO of a social media startup would definitely surface threats of deplatforming from the cloud provider to the CEO.


In a well-run organization with great employees, that is what you would expect to happen.


How did you come to that conclusion? Honestly, just asking.


Not the OP, but it's the reason AWS gave:

In an email obtained by BuzzFeed News, an AWS Trust and Safety team told Parler Chief Policy Officer Amy Peikoff that the calls for violence propagating across the social network violated its terms of service. Amazon said it was unconvinced that the service’s plan to use volunteers to moderate calls for violence and hate speech would be effective.

“Recently, we’ve seen a steady increase in this violent content on your website, all of which violates our terms," the email reads. "It’s clear that Parler does not have an effective process to comply with the AWS terms of service.”

https://www.buzzfeednews.com/article/johnpaczkowski/amazon-p...


Even that does not say any planning was done on Parler.


This does [1]. According to the description, Parler was the "preferred" platform for planning of right-wing election-related violence. According to the video, now that Parler is gone, Telegram, a tool that over 500 million people use everyday for entirely legitimate purposes, is now nothing more than an outlet for Qanon. If this isn't a prima facie example of the completely balanced, factual, and accurate reporting by our friends at MSNBC, I don't know what is.

[1] https://www.msnbc.com/ali-velshi/watch/far-right-extremists-...


I read AWS's letter, not BuzzFeed's take on it. I advise you to do the same.


I did, and I see no distinction between what Parler was accused of and what the article we are discussing uncovered about Facebook. Actual planning - not just vague calls for violence - occurred, in plain view, on Facebook, and nothing was done about it. Therefore, if we apply the same standard, Facebook should not be operating this morning. Here’s a direct quote from the Amazon letter you referred to:

”...we cannot provide services to a customer that is unable to effectively identify and remove content that encourages or incites violence against others.”

The article we are discussing clearly found that Facebook meets precisely the same criteria. Therefore, services should not be provided to them, correct? Whoever provides their bandwidth undoubtedly has a very similar TOS...they all have similar provisions about network abuse.

Also, here’s a quote from the article you’re referring to:

”People on Parler used the social network to stoke fear, spread hate, and allegedly coordinate the insurrection at the Capitol building on Wednesday.”

I don’t know why you and others on here continue to argue that a double standard, combined with either inaccurate reporting or outright lies, is not at play here - despite overwhelming and obvious evidence to the contrary. But it’s disingenuous and makes me sad not just for HN, but for the country at large.


I am not arguing against a double standard at all. I am saying there is no evidence that riots were planned on Parler, and AWS never said they were.


The parent post was pointing out that Parler was banned because of an inability to effectively moderate content going forward (not because of whether things were historically planned there or not).


If anything, the same is true for other platforms like Facebook and Twitter as well.


Parler was killed for one reason: they said they wouldn't ban Trump. It's probably tamer than Facebook, Twitter, or even Reddit, where with one search you can find threats and calls for violence.




