
If they're bad at identifying misinformation, then I'll fault them for that. Has Google had a tendency to censor things on the basis of information that later turned out to be true? Your examples are judgments made by groups other than Google, and I don't think the groups responsible for those judgments ever pushed for anyone to be censored over them.


In the early days of the coronavirus pandemic, you couldn't use Google to find information at all, because everything other than the WHO's guidance was getting censored. I used Bing for a while because its censorship was much slower to kick in.


But that is precisely the lesson history taught us. Misinformation, fundamentally, cannot be reliably identified. It's easy to say "oh, everyone who believed that was an idiot" when talking about Galileo being imprisoned for heliocentrism. If he were alive today, we'd be calling him a "far-right conspiracy theorist" or something equally nasty.


I can't overstate how much I disagree with that being an unambiguous lesson of history. There have been plenty of cases where misinformation was identified and pushed out in favor of better information. But mainly, this is about an individual company deciding whether or not to participate in spreading (mis)information, not about the government choosing to jail someone. It's terrible that Galileo was jailed and that should not have happened, but I don't think free speech should mean that any specific newspaper was legally obligated to run any article he wrote.


> Has Google had a tendency to censor things on the basis of information that later turned out to be true

Yes. YouTube censored the lab leak hypothesis.



