
I think we should start by removing the immunity that large platforms enjoy against relevant criminal prosecutions. Take suicide as an example: in some jurisdictions, driving another individual toward suicide is a criminal matter. If evidence can be put before a court that the victim was a heavy social media user and that the platform's algorithm contributed to that suicide, then maybe the publisher should be prosecuted.

Do I expect that some social media companies would really struggle to continue operating at their current scale under these changes? Yes, I 100% expect it, and I think that's great. It may lead to a smaller and more personal web. Your business model has no inherent right to exist if it harms people. Maybe, for example, you will need to hire more humans to handle moderation so that you stop killing people, and if humans don't scale, well, too bad: you're going to get smaller. We regulate gambling, tobacco, etc. to limit the harm they do; I don't see why social media should be any different.

To have the biggest impact without stifling innovation, we could start by applying this rule only to platforms above a certain revenue threshold. A combination of legislative and judicial action is likely needed: there may already be crimes on the books that these platforms are committing, but courts have not traditionally treated a corporation as the person who committed such a crime, certainly not at scale against thousands of victims. In other cases we may need to amend laws to make it clear that using an algorithm to harm people at scale does not make you immune to the consequences of the harm you caused.


