
I guess I would rather see a system that focuses on scanning all images for illegal content (presumably there are services where you can hash an image and check it against known child porn images, for example), and on tagging all other images for certain things (like David Hasselhoff's bare chest or whatever concerns your users). Give the users tools to flag images as illegal content, or for misapplied or missing tags, and the tools to determine which type of content they wish to see. Prioritize handling known illegal content found earlier, then user-flagged possibly illegal content, then missing or misapplied tags. Handle DMCA takedown requests according to the letter of the law.
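
A minimal sketch of that triage pipeline in Python, under some loud assumptions: the known-bad hash set and all function names here are hypothetical, and real matching services (e.g. Microsoft's PhotoDNA, used against the NCMEC database) sit behind a vendor API and use perceptual hashing, whereas plain SHA-256 only catches byte-identical copies.

    import hashlib
    import heapq
    import itertools

    # Priority levels in the triage order described above: known illegal
    # content first, then user-flagged content, then tag corrections.
    PRIORITY_KNOWN_ILLEGAL = 0
    PRIORITY_USER_FLAGGED = 1
    PRIORITY_TAG_ISSUE = 2

    # Hypothetical local set of known-bad image hashes, stood in for a
    # real hash-matching service.
    known_illegal_hashes: set[str] = set()

    _counter = itertools.count()  # tie-breaker so heapq never compares payloads
    review_queue: list = []

    def enqueue(priority: int, image_id: str) -> None:
        heapq.heappush(review_queue, (priority, next(_counter), image_id))

    def ingest(image_id: str, image_bytes: bytes) -> None:
        """Hash a new upload and queue it at top priority on a match."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        if digest in known_illegal_hashes:
            enqueue(PRIORITY_KNOWN_ILLEGAL, image_id)

    def user_flag(image_id: str) -> None:
        enqueue(PRIORITY_USER_FLAGGED, image_id)

    def tag_report(image_id: str) -> None:
        enqueue(PRIORITY_TAG_ISSUE, image_id)

Moderators then just pop the queue: heapq guarantees known-hash matches come out before flags, and flags before tag fixes, regardless of arrival order.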

Let the users help you, and let them choose what they want to see. Use conservative defaults if you wish, but trying to guess what users might find objectionable and filtering that out ahead of time sounds like a losing proposition to me. They'll tell you what they don't like. When they do, make a new tag and start scanning for that with the AI goodness.
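
The user-side filter can stay trivially simple. Another sketch with hypothetical names: each image carries a set of tags (from the classifier or from other users), each user keeps a set of tags they never want to see, and a "conservative default" is just a pre-populated blocked set for new accounts.

    # Conservative default blocked set for new accounts; user-editable.
    DEFAULT_BLOCKED_TAGS = {"nsfw"}

    def visible(image_tags: set[str],
                blocked_tags: set[str] = DEFAULT_BLOCKED_TAGS) -> bool:
        """Show the image only if none of its tags are blocked."""
        return not (image_tags & blocked_tags)

When users report something new, you add a tag, start scanning for it, and anyone who objects adds it to their blocked set; nobody else's view changes.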

Of course, this is what I would like to see as a user. I'm probably an atypical user. And I'm not the person about to bet their life savings on a start-up, either, so take this with a grain of salt. I just wish that content providers would stop trying to save me from the evils of life or whatever their motivation is.



This addresses only a small subset of the problem.



