
It depends - is Valve also claiming liability protection under section 230 as an information service provider?


Being an information service provider with Section 230 protection doesn't obligate that provider to allow anyone to post anything.


No, but it should make the platform similar to a public square. Taking a government grant of liability protection should come with obligations to the public.


> Taking a government grant of liability protection should come with obligations to the public.

Why? That's a serious question.

The reason liability protections exist for information services is that information services can't exist without them. It's not possible for an information service provider to be strictly liable for what their users post while offering anything like a reasonable quality of service or cost. Think 10-hour moderation queues to post on an Instagram that costs $15/month and requires your driver's license or other photo ID to sign up, so they can forward libel suits to you.

Requiring strict liability on the part of information service providers will likely reduce the amount of free speech drastically, because they're going to aggressively take down or reject anything that has the slightest possibility of getting them sued. They won't suddenly morph into the town square, because an unmoderated Internet town square is a filthy, ugly place that repels users and advertisers and hurts the bottom line.


"Requiring strict liability..." which is why I did not write that.

I would propose that moderation decisions be logged and reviewable on demand (at the plaintiff's cost) in front of a reputable arbitrator of the platform's choice.


I have no idea if this is a good idea or not. If it makes some plaintiffs feel better that they're being heard, great. But I don't believe it would substantially alter the status quo. Anyway, you do you my friend.


Is that even possible at any sort of scale?


If the plaintiffs pay, I think so. All the platforms would need to do is log the moderation decisions they already make.
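For illustration only, here is a minimal sketch in Python of what such an append-only log could look like. The field names, the JSON-lines file format, and the idea of citing a specific published rule are assumptions made for the example, not anything a platform actually publishes today:

  # Hypothetical sketch of an append-only moderation log.
  # Field names and the JSON-lines format are illustrative assumptions.
  import json
  from dataclasses import dataclass, asdict
  from datetime import datetime, timezone

  @dataclass
  class ModerationAction:
      timestamp: str   # when the action was taken (ISO 8601, UTC)
      account: str     # account affected by the action
      action: str      # e.g. "remove_post", "suspend_account"
      content_id: str  # identifier of the affected content
      rule_cited: str  # which published content rule was applied
      moderator: str   # human or automated system that acted

  def log_action(path: str, entry: ModerationAction) -> None:
      """Append one moderation decision to a JSON-lines log file."""
      with open(path, "a", encoding="utf-8") as f:
          f.write(json.dumps(asdict(entry)) + "\n")

  log_action("moderation.log", ModerationAction(
      timestamp=datetime.now(timezone.utc).isoformat(),
      account="user123",
      action="remove_post",
      content_id="post/456",
      rule_cited="rule-3-harassment",
      moderator="trust-and-safety",
  ))

The point is only that the record-keeping itself is cheap; whether disclosure and review at scale are workable is the separate question raised above.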


> is Valve also claiming liability protection under section 230 as an information service provider?

What part of Section 230 do you think had any bearing on the issue being discussed?


Section 230 is a valuable grant of liability protection, and thus morally obligates platforms to act as a public forum. At the very least, moderation decisions should be written down and reviewable by a third party. The law should be reformed to reflect that.


> Section 230 is a valuable grant of liability protection, and thus morally obligates platforms to act as a public forum.

That makes no sense, since the entire, explicit purpose of Section 230 was to free platforms from preexisting disincentives so that they could be free to “restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected” without incurring the liability that would otherwise attend such editorial control.

You are literally suggesting that the price of Section 230 protection ought to be not doing the things to which Section 230 protection applies in the first place.

Also, I think it's wrong, in any case, to think of Section 230 as a special grant of liability protection, any more than the preexisting difference between distributor liability and publisher liability is a special grant. It's a recognition that online media enables models that would not be practical with the media for which the traditional liability roles for content developed, and that venues for content that wasn't pre-screened in detail, but over which largely reactive editorial control was exercised, were all of the following: technically and economically practical on the internet, inconsistent with the premises of the classic publisher vs. distributor liability analysis, and, already at the time the Act was considered, starting to be stifled by application of the traditional publisher vs. distributor rules. We don't force bookstores to be neutral platforms in order to be free of publisher liability, because we recognize that bookstores aren't naturally going to pre-screen content at the detailed level print publishers will, and it doesn't make sense to stifle them by forcing them to. Section 230 is the same idea, just for a business activity that didn't exist when the common law of liability was evolving.


That's why section 230 needs to be rewritten. I'm fine with moderation, and I'm fine with sites that publish user-generated content defining their own content rules. What I'm not fine with are rules that are unevenly enforced in order to play favorites. I would reform section 230 to say that you must post the site's user-generated content rules, and you must post a moderation log. The moderation log would consist of the list of moderation actions taken, the accounts affected, and the time of each action. Any moderation action that appears not to align with the stated content policy should be reviewable by a third-party accredited arbitrator, at the plaintiff's expense. The net effect would be to make sites write down detailed rules about content and enforce them fairly on all sides. Right now there are simply too many flagrantly biased or inaccurate moderation decisions on these sites, and many of them seem to be motivated by political or economic reasons. It's one thing to ask that sites be the "public square" because of the huge gift of liability protection that section 230 grants; that argument has not held up in court. It's a totally different thing to require sites to play by their own rules, and to hold them accountable for each hypocritical moderation action that genuinely hurts the "little guy".
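As a rough sketch of the mechanical side of that proposal: assuming a JSON-lines log like the one sketched earlier in the thread and a hypothetical set of published rule identifiers, flagging actions that don't cite a stated rule is a simple scan. All names here (PUBLISHED_RULES, the log path, the rule_cited field) are assumptions for illustration, not a description of any real platform:

  # Illustrative sketch: flag logged moderation actions that do not cite
  # one of the site's published content rules, so a plaintiff could take
  # them to an accredited arbitrator. Rule IDs and log fields are assumptions.
  import json

  PUBLISHED_RULES = {"rule-1-spam", "rule-2-illegal-content", "rule-3-harassment"}

  def actions_needing_review(log_path: str) -> list[dict]:
      """Return logged actions whose cited rule is missing or unpublished."""
      flagged = []
      with open(log_path, encoding="utf-8") as f:
          for line in f:
              entry = json.loads(line)
              if entry.get("rule_cited") not in PUBLISHED_RULES:
                  flagged.append(entry)
      return flagged

  for entry in actions_needing_review("moderation.log"):
      print(entry["timestamp"], entry["account"], entry["action"])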



