
No. I'm saying that Google's system of using machine learning to look for images of naked children and reporting parents to the police on the strength of a single false positive (instead of only looking for known examples of kiddie porn) is exceptionally problematic.

The fact that Google refuses to put humans in the loop, even though it knows the decisions its algorithms make on this and other subjects are highly unreliable, simply adds insult to injury.

Losing access to your account because Google's algorithm screwed up is bad enough. Being accused of child abuse because you took a picture of your child's first bath is a bridge too far.


