
In my country, India, these platforms are used less for free speech and more for brainwashing and spreading hate and misinformation. Most of these posts are in Hindi, a major language around here, and call for all kinds of hate such as suppression of a specific religion, call for genocide, invading and acquiring neighboring countries etc.

I've tried reporting such posts multiple times, but hate-filled posts are neither removed nor restricted. If a platform cannot provide adequate moderation, it should stop operating in my country and be held responsible for providing a platform for spreading hate pseudo-anonymously.



How is this example any different from what the parent has said? If people feel these platforms are negatively impacting them, they should stop using them. Or do you believe others, or the government, have a right to disallow what people want to watch by their own choice? You may call it brainwashing, but others may disagree.


> If people feel these platforms are negatively impacting them, they should stop using them.

The problem is how to stop the mob attacking you from using them.

An American equivalent might be if social media had existed 100 years ago and was being used to encourage lynchings. Yes, it really is that bad in some places.

The problem is that FB does moderate things relevant to the US but ignores the rest of the world. They will remove white supremacist material in the US, but not the equivalent elsewhere.


The solution is in the problem: network effects. If everyone stopped using them due to the deleterious effects, the problem would solve itself.


Yes, but how do you get "everyone" to stop using them? I use FB purely because of network effects. I hate it, but there would be a real cost to not using it.


What is the cost? The excuses in this entire thread are quite weak and overblown. Some people are really saying their kids should continue using apps they know are harmful simply because they would be socially outcast otherwise, which is simply not true.


Why do you think a lynch mob would stop using social media when it is an excellent tool for organising lynchings?


Notice how the government made lynchings illegal, not the method of communicating such actions.


> do you believe others or the government has a right to disallow what people want to watch via their own choices

Yes. As an extreme example: watching cheese pizza is not allowed by governments. We have also collectively come to consider murder socially and legally unacceptable. We can and should regulate social media if posts read as follows:

- we should invade and bomb that country to bits
- we should destroy all places of worship belonging to XYZ religion
- we should vote for XYZ because only he is going to save our religion from PQR
- and much worse which I can't type here as the moderation team of HN would omit those

IMHO: give the current form of social media another few decades and it will come out shining bright, just like opioids did in the USA.

These same social media platforms, when required by law, become very effective at moderation, but there's next to no moderation in my country, and most of the hate and abuse is counted as just another engagement metric.


Watching CP and murdering people is in no way comparable to any of those bullet points you made. In the US at least, uniquely among many nations, the principle and constitutional right of freedom of speech reigns supreme over many, many others, so there is no chance that any of those bullet points would (or should, given such a principle) be regulated.

One can and should be able to espouse those beliefs, regardless of whether they are true, because the alternative is much worse: a world where the right to such expression is severely curtailed. Hell, someone got arrested for taunting the Queen in the UK, something that legally could not happen in the US had a similar person taunted a government official.


Hate speech and incitements to violence against the Rohingya proliferated for years on Facebook. Deleting the app would not have saved the Rohingya from genocide.


I have seen some of the same with Sri Lankan posts. Loathsome stuff in Sinhala. Not calling for genocide, but definitely encouraging persecution and bigotry. One group that was particularly poisonous was removed after a campaign by many people. One person complaining gets nowhere. I am sure there is more similar material elsewhere.

I think the underlying issue is that American companies view everything through the lens of American culture, and if it's not a problem in the US, then it is not offensive.

I once reported a racist comment on FB. Someone said that people of their race should not "interbreed" with people of another race because the latter are evil. FB said it did not violate their community standards.

IMO it was probably because it was a comment by a black person (probably American) about white people. That is not a major problem in the US, so it's fine.



