
It drives me absolutely nuts when I encounter a video platform upstart that has not adequately prepared (or prepared at all) for the inevitable onslaught of undesirable and illegal content that users will soon start uploading if the platform gets any real traction at all. No UGC site/app is immune. Even when prepared, it is an eternal, constantly-evolving battle as users find ever more clever ways to hide their uploads or themselves. If you aren't ready for it at all, you may never be able to catch up. And while a lot of the undesired content is merely annoying to get rid of, some is catastrophic -- a single publicly visible upload of something like child porn can be the death knell for the platform.

I’m going to go ahead and refute some of the counterarguments I’ve heard a million times over the years just to get it out of the way.

“It could be a while before it’s necessary.”

People seeking to upload and share unsavory content are constantly getting kicked off every other platform for doing so, and thus are always on the lookout for something new to try where they might be able to get away with it, at least for now. They are the earliest adopters imaginable.

“Just let users flag content”

Lots of issues here, but here are a couple of big ones.

1. You cannot afford for something like child porn to be visible long enough to be flagged, or to be seen by anyone at all. If something like this gets uploaded and is publicly visible, you could be screwed. I worked on a video platform once that had been around a couple of years and was fairly mature. One video containing child porn managed to get uploaded and was publicly visible for about one minute before being removed. It was a year before the resulting back-and-forth with federal agencies subsided and the platform's reputation recovered.

2. People uploading things like pirated content tend to do so in bulk. You might see people uploading hundreds of videos of TV shows or whatever. It may exceed legitimate uploads in the early days of a platform. You do not want to burden users with this level of moderation, and they aren't likely to stick around anyway if good videos are lost in a sea of crap that needs to be moderated.
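The bulk pattern in point 2 is at least cheap to detect as a first pass. Here's a minimal sketch of a sliding-window heuristic that holds prolific accounts for human review; the class name, window, and threshold are all invented for illustration, not taken from any platform described here:

```python
from collections import defaultdict, deque

# Invented placeholder values -- real thresholds would be tuned per platform.
WINDOW_SECONDS = 3600
MAX_UPLOADS_PER_WINDOW = 20

class BulkUploadDetector:
    """Flag accounts whose upload rate in a sliding time window
    exceeds a threshold, so a human can take a look."""

    def __init__(self, window=WINDOW_SECONDS, limit=MAX_UPLOADS_PER_WINDOW):
        self.window = window
        self.limit = limit
        self.uploads = defaultdict(deque)  # user_id -> upload timestamps

    def record_upload(self, user_id, timestamp):
        """Record an upload; return True if the account should be
        held for manual review."""
        q = self.uploads[user_id]
        q.append(timestamp)
        # Drop timestamps that have fallen out of the window.
        while q and timestamp - q[0] > self.window:
            q.popleft()
        return len(q) > self.limit
```

This obviously only catches the naive bulk uploader; determined pirates will spread uploads across accounts and time, which is part of why the manual layer never goes away.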

“We’ll just use (some moderation API, tool, etc.)”

Yes, please do, but I'm not aware of anything that works 100%. Even if you filter out 99% of the bad stuff, if the 1% that gets through is kiddie porn, say goodnight. These tools get better all the time, but users who are serious about uploading this kind of stuff also keep finding new and interesting ways to trick them. As recently as 2017, a pretty big video platform I worked on was only able to stop everything with a combination of automated systems and an overseas team that literally checked every video manually. (We built a number of tools that enabled them to do this pretty quickly.)

“Content shouldn’t be moderated”

Child porn? Hundreds of pirated episodes of Friends instead of legitimate user videos? (Even if you are pro-piracy, you don't want to pay to host and serve this stuff, and you don't want it to distract from legit original content from your users.) What about when some community of white supremacists gets wind of your new platform and their users bomb it with all their videos?

Do not take this stuff lightly.

EDIT: I've spent most of the last decade as an engineer working on UGC and streaming video platforms.



> 2. People uploading things like pirated content tend to do so in bulk. You might see people uploading hundreds of videos of TV shows or whatever. It may exceed legitimate uploads in the early days of a platform. You do not want to burden users with this level of moderation

Not to mention that viewers aren't likely to flag a complete camripped run of My Little Pony unless they're stupid or have an axe to grind (either against the IP that was uploaded, piracy in general, or the specific uploader). Viewers are often drawn to platforms specifically because they are flooded with piracy in their early days.


Exactly. Setting aside all legal concerns and whatever anyone's philosophy is about piracy or moderated content, you still have the enormous concern about what kind of community you are fostering and what kind of people you are attracting based on what content you allow to be surfaced.


All the Reddit alternatives are each an example of why the early community matters so much. Being a piracy haven is probably the "best" outcome in terms of community-building compared to all the other common fates of low-moderation websites in growth mode.


How about accepting stuff manually before publishing, and only for paid users? :P


> It was a year before the resulting back-and-forth with federal agencies

You're blaming a lack of content moderation rather than a law enforcement system that holds you responsible for something you had no control over, when that system is the one that actually failed to do its job in this case?


> a law enforcement system that holds you responsible for something you had no control over when it actually failed to do its own job in this case?

Investigating these issues is their job. They don’t show up assuming the site operator is the guilty party, but they do need the operator's cooperation in collecting evidence so they can pursue the case.

It’s analogous to a crime being committed on your property. They don’t show up to charge the property owner for a crime someone else committed, but they do need access to the property and cooperation for their investigation.


We weren't held responsible, but it was still investigated and required our cooperation and was not the best use of our resources. Honestly, the public reputation part was far and away the more unfortunate consequence.

Trust me, I have numerous concerns around the legal issues and the chain of responsibility, but what choice do you have? Are you going to start a fight with them out of principle and hope this works out in your favor? While still devoting the time and energy to the video platform you set out to build in the first place?



