
> It means not just pictures of people without clothes on, but actual images/video of children being abused.

Who said that?

They're the same legally. People have said NCMEC's database includes entirely legal images. And groups like NCMEC have tried to rename child pornography for years.




But the point is there’s an existing database. Either a photo is in that database or it isn’t, so your kids’ naked photos don’t get flagged. And even if a photo in the database is legal, not only is there human verification and a minimum match count required, it would be very strange for your iCloud library to contain photos that were part of CP collection sets in the first place, regardless of their content.


Their point was CSAM means especially bad CP. Your points were different.

It's a perceptual hash. Matches don't have to be exact. And people have engineered collisions for other perceptual hash algorithms. What a computer sees and what a person sees can be very different.
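
For anyone unfamiliar with how these hashes behave, here's a minimal sketch of a difference-hash ("dHash") style perceptual hash, assuming Python with Pillow installed. It is not Apple's NeuralHash (that's a learned model), but it shows why matching is done by small Hamming distance rather than exact equality, and why two files that share no bytes can still "match". The filenames are hypothetical.

    from PIL import Image

    def dhash(path, hash_size=8):
        # Shrink to a (hash_size+1) x hash_size grayscale grid and compare
        # each pixel with its right-hand neighbour: one bit per comparison.
        img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
        pixels = list(img.getdata())
        bits = 0
        for row in range(hash_size):
            for col in range(hash_size):
                left = pixels[row * (hash_size + 1) + col]
                right = pixels[row * (hash_size + 1) + col + 1]
                bits = (bits << 1) | (1 if left > right else 0)
        return bits

    def hamming(a, b):
        # Number of differing bits; a small distance counts as a "match".
        return bin(a ^ b).count("1")

    # Hypothetical usage: a recompressed or resized copy usually lands within
    # a few bits of the original, even though the files share no bytes.
    # print(hamming(dhash("original.jpg"), dhash("recompressed.jpg")))

Engineered collisions exploit exactly that slack: perturb an unrelated image until its hash lands within the match distance of a target.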

People have said the human verification just involves a "visual derivative" of the suspect image, and Apple never really explained what that is. And human verification doesn't reassure people who object to anyone viewing their private photos without their consent, for any reason.

People have said any images collected during an investigation go in the database. Do you think people who collect explicit photos of 17 year olds don't collect explicit photos of 18 year olds?


Sorry. Either the posts got updated or I confused this thread with another, since others were trying to say that (re: hash versus classifier).

Re hashing, Apple claims a 1-in-1-trillion chance per year of incorrectly flagging a given account. These kinds of systems are not rolled out lightly, and even if that number is wrong, it feels unlikely to me that it's off by much. If it is, and the system produces too many false positives, that will get noticed and the system pulled until it's fixed and back at the required false-positive rate.
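
To make the threshold argument concrete, here's a rough back-of-the-envelope sketch in Python, not Apple's published math: assume every photo has some independent per-image false-match probability p (an assumption), and an account is only flagged after t matches. The binomial tail then gives the chance an innocent account with n photos crosses the threshold; the numbers below are made up and deliberately pessimistic.

    from math import comb

    def flag_probability(p, n, t):
        # P(at least t of n photos falsely match), assuming independent
        # matches each with probability p. Computed as 1 - P(fewer than t).
        below = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t))
        return 1.0 - below

    # Hypothetical numbers: a 1-in-1,000 per-image false-match rate (far worse
    # than anything Apple claims), 20,000 photos, threshold of 30 matches.
    print(flag_probability(1e-3, 20_000, 30))   # roughly 0.02

The point being that with a realistically low per-image rate and a threshold in the dozens, that probability falls off a cliff.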

Ultimately, beyond Apple, if you're arrested and come before a judge, you'd expect humans at that point to look at the evidence. In fact, I'd expect the DA or whoever to look at the photos as well. Legally, you can't be convicted in the US without evidence (how all of this works in another country is another matter).

If legitimate 18+ porn is mixed into this database, and someone ends up being charged because of it, fights, and wins, I'd expect a number of countersuits to follow. It seems incredibly unlikely to me both that non-CP material (17-year-olds rather than the far more likely 7-year-olds) makes up a meaningful part of that database and that you'd happen to be saving that exact material to your iCloud photo library. There's so much legal porn out there, in such vast quantities, that I'm not sure the hypothetical situation you describe will ever actually occur.


Apple's claim is unverifiable. It counts for nothing. And people with relevant experience have called it bullshit.[1]

Even just arresting someone means separating them from their children. Preventing them from working if their job involves children. Seizing all their electronics for months. Violating the privacy of their files and belongings. Legal costs. Possibly media reports. Putting innocent people through this because a secret algorithm said a secret database matched a secret number of times is unacceptable.

Several people have claimed false positives are in the database. Including someone who verifiably worked with it.[1]

US prosecutors have absolute immunity. Any civil suit would be dismissed swiftly.

People don't collect random subsets of all pornography ever made. Some is much more popular. People have specific tastes. Photo sets exist.

[1] https://www.hackerfactor.com/blog/index.php?/archives/929-On...



