Hacker News

> Prosecutors would have to prove you upvoted knowing it was "the wrong porn".

Why would you want to prosecute people for clicking the arrow button on a website IN THE FIRST PLACE? That's some 1984 shit right here. Prosecution for unlawfully liking a picture, wtf is this shit?

>> All a site owner should be required to do is follow the law here, like disclosing the uploader's data to the authorities if what they did is deemed illegal

>And take it down? With that stipulation it is even more than what I said, because best-effort moderation is not necessarily reporting to the government. Just bans and removals are best effort, right? Once the content is taken down, I don't think there should be a reporting requirement unless a criminal law was broken.

If it is illegal it should be taken down; if it is not, it should not. Sharing a sex tape without the partner's consent is illegal AFAIK; the offender gets reported and the court/police orders the site to take it down.

Moderation has NOTHING to do with it.

>> The incentive is users wanting to use your site because it is not filled with shit.

>Are you kidding? Users will flock specifically for shit and pay for it too.

I should elaborate on that: "it is not filled with shit they don't like". Sites like reddit give users a good chance to find a space where there is a lot of stuff they like, and that's why they're popular.

There might be a subreddit filled with everything you hate, but you ain't getting it in algorithmic suggestions (suggestions are not the main front of reddit, unlike twitter/facebook) and you don't have to go there.

> If I bully you on HN for example, what is the incentive to ban me other than the goodwill of moderators?

I mean... better-written social media sites just have a [Block] button where I can choose not to see a given person's ramblings.

But giving users the power to choose what they do or do not want to see seems to be passé. Although, to be fair, twitter does have some features around it. But people want to decide what other people don't get to see, so they'd rather bother moderators with false positives than just mute the person they disagree with.
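To make the distinction concrete, a minimal sketch (with a hypothetical data model, not any real site's API) of how a user-side [Block] works: filtering happens per viewer, so a blocked author's posts disappear only for the person who blocked them, and nothing is removed for anyone else.

```python
# Minimal sketch of per-viewer blocking: posts and block lists are
# hypothetical structures, not a real platform's schema.

def visible_posts(posts, blocked_by_viewer):
    """Return the posts this viewer sees, hiding blocked authors."""
    return [p for p in posts if p["author"] not in blocked_by_viewer]

posts = [
    {"author": "alice", "text": "hello"},
    {"author": "troll", "text": "ramblings"},
]

# This viewer has blocked "troll"; every other viewer still sees both posts.
print(visible_posts(posts, {"troll"}))
```

The point of the design is that no moderator is involved: the filter is applied at read time per account, which is why it can't address content that harms people who never see it.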

It would actually be interesting to see stats on how many "banned because you were naughty" cases end up with that user just creating a new account, but that would probably be a very hard stat to get.

> My suggestion is to provide that incentive so that moderation happens not just when it is convenient for the site operator.

Your suggestion is giving moderators so much work the site would be unprofitable. How on earth would you "verify the provenance" of a meme subreddit? Or do you want moderators to google-image-search every image posted?

And moderation is always convenient for the site operator, especially if they don't pay for it.

"Curation" is the bigger problem; moderation is by definition weeding out the bad and steering the discussion into, well, discussion instead of shouting match.

But fucking with recommendations to show whatever your corporate interest aligns with is another matter; a recent example is Japanese twitter being filled with anime and mechs instead of polarizing politics after Musk fired their content team [1].

> Moderation is why you are not liable to begin with.

You are liable for shit you say on the internet, moderation or not. This law was about moderators and site owners not being liable for stuff users put on their sites, not about the users themselves.

>> Also "harmful content" is nowadays waaay to easily interpreted as "the thing I don't like".

>Not really, there are clear enough legal definitions. Of course there is criminal law, but in addition to that, whatever in state or federal law is already defined as an action or speech for which you can become liable, the site owner now also becomes liable in addition to the original poster if and only if the site owner refuses to do moderation. So if "the thing I don't like" is something I can sue you for, I would need to win a suit against the original poster first and then sue the site owner for refusing to have moderation capabilities or for not reviewing flagged posts and taking them down.

If only takedowns were for actual legal cases we'd be in a much better place. What social sites define as "harmful content" and what the law does are vastly different, like the recent disaster with vaccine communication.

* [1] https://www.forbes.com/sites/olliebarder/2022/11/14/japanese...



> better written social media sites just have [Block] button where I can choose to not see given person's ramblings.

Or I could sue you for your ramblings. Also, blocking only means I don't see the content; others still seeing the content is the harm in cases like slander or illegal porn, etc.

> If it is illegal it should be taken down; if it is not, it should not. Sharing a sex tape without the partner's consent is illegal AFAIK; the offender gets reported and the court/police orders the site to take it down.

> Moderation has NOTHING to do with it.

Moderation has everything to do with it. If you don't take it down after knowing it is illegal, you should become criminally liable and, where the law allows, also civilly liable.

> This law was about moderators and site owners not being liable for stuff users put on them, not users themselves

Exactly, but when they knowingly refuse to take down content they know is illegal or civilly damaging, they should be held accountable.

> There might be subreddit filled with everything you hate but you ain't getting it in suggestions by algorithm (as suggestions is not main front of reddit, unlike twitter/facebook) and you don't have to go there

See above. Me being bothered has nothing to do with it. If you slander me or promote content that is in any way damaging to me, I will sue you and complain to moderators. If the mods refuse to take action, then let the court decide if they should be held as co-conspirators. There are so many types of such damage, but an extreme example that is pushing for repeal of 230 is CP and revenge porn. Would you tell someone to block users who post naked pictures of them or their child? If they ask a subreddit and reddit admins to take it down and they don't comply, then both reddit execs and mods should be held criminally and civilly liable.

> Your suggestion is giving moderators so much work the site would be unprofitable. How on earth you'd "verify the provenance" of a meme subreddit? Or you want moderators to google image search every image posted ?

They should limit membership to a manageable volume of daily flagged posts they can review. I never said meme provenance; that comment was specifically about porn. You are grasping at straws here. But if someone made a meme that is damaging to me, I should be able to sue mods and site operators when they refuse to take it down even after I have flagged it and reported the harm being done to me, since their refusal to moderate is an explicit decision and the damage inflicted is well known to them (via my flag/report). I don't care if a bunch of subreddits die off; I care more about actual harm to people being reduced.
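The "limit membership to what mods can review" idea above can be sketched as simple arithmetic; the numbers and the `flag_rate` parameter are illustrative assumptions, not anything from the thread.

```python
# Hypothetical sketch: cap a community's daily post volume based on how
# many flagged posts its moderators can realistically review.

def daily_post_cap(active_mods, reviews_per_mod, flag_rate):
    """If each mod can review `reviews_per_mod` flags/day and a fraction
    `flag_rate` of posts gets flagged, return the sustainable post volume."""
    if flag_rate <= 0:
        raise ValueError("flag_rate must be positive")
    return int(active_mods * reviews_per_mod / flag_rate)

# e.g. 5 mods, 40 reviews each per day, 2% of posts flagged:
print(daily_post_cap(5, 40, 0.02))  # 10000
```

Under these made-up numbers, a sub with 5 active mods could sustain about 10,000 posts/day before its flag queue outgrows review capacity; the point of the proposal is that growth beyond that would have to be throttled rather than left unreviewed.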

> And moderation is always convenient for site operator, especially if they don't pay for it.

Again with this shit. Have you never heard of stormwatch, kiwifarms, 4chan, and 8chan? How is moderation profitable for them, or for random porn sites? Even with reddit, engagement is profitable, not moderation. Most of reddit is porn; is it profitable for them to moderate that? Users don't care who else gets hurt so long as it isn't them or their "group".

> If only take downs were for actual legal cases we'd be in much better place. What social sites define "harmful content" and what law does is vastly different, like the recent disaster with vaccine communication.

I only care about what the law has already defined as harmful. In a way it would give them guidance, so they won't have to be arbiters of what is harmful. Is calling someone a racial epithet illegal? No, but you can get sued for defamation or "emotional distress" or whatever depending on the state, so they can now use that as guidance instead. But they can still moderate on their own terms in addition to the law if they choose to, just not in ignorance of it.

> Why you want to prosecute people clicking the arrow button on the website IN THE FIRST PLACE? That's some 1984 shit right here. Prosecution for unlawfully liking a picture, wtf is this shit ?

Alright, how about a stipulation that your like generated some material gain for the site or the original poster? Because stuff like materially supporting a terrorist is illegal, so if a terrorist posts white supremacist violence or jihadist content, people who upvote it get prosecuted for materially supporting a terrorist over even a cent of ad profit.



