
"They're going to scan your phone and probably have someone review any photos with a lot of human flesh in them" would be enough to get a lot of non-technical users to take notice.

That would make a lot of people nervous, let alone anyone who thinks through the implications of how far the line is being pushed on how public your phone is.




Simply turning off iCloud Photos will ensure that photos on your iPhone are never scanned. Why are you framing this as being about photos stored on-device? Photos in iCloud have always been available to Apple through iCloud backups. If you are concerned about privacy, turn it off.

"And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account."


So if an offender turns off iCloud Photos, this move will be completely useless?

How would that help catch them if they can simply flip the switch?


> So if an offender turns off iCloud Photos, this move will be completely useless?

No on-device photo scanning happens unless iCloud Photos is enabled. Isn't it funny when you get the most important aspect wrong?

> How would that help catch them if they can simply flip the switch?

They never claimed that this would help "catch them" if they are not using iCloud Photos.


It would, but luckily that's not what's happening


Go read the announcement, the "CSAM detection" heading [0]. It is exactly what they are doing.

Although they're assuring us that they don't make mistakes. The technical term for that is either "blatant deception" or "delusion". Apple is impressive, but they haven't developed a technology that can't make mistakes.

[0] https://www.apple.com/child-safety/


That isn't what they're doing at all. You have significantly misunderstood or conflated different sections.

Though I don't blame you at all: read through the various hysterical posts about this and you'll find a lot of extraordinary misrepresentations.


Ah, I see what you're getting at. They're currently hashing for specific photos.
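
Roughly, "hashing for specific photos" looks something like the sketch below (purely illustrative, not Apple's actual NeuralHash or pipeline; the hash values, names, and distance threshold are made up): a photo is reduced to a perceptual fingerprint and only flagged if it is close to a fingerprint already in a database of known images.

    from typing import Set

    # Hypothetical database of 64-bit perceptual hashes of known images
    # (values are placeholders, not real hashes).
    KNOWN_HASHES: Set[int] = {0x9F3A6C21B4D8E07F, 0x1122334455667788}

    def hamming_distance(a: int, b: int) -> int:
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    def matches_known_image(photo_hash: int, max_distance: int = 4) -> bool:
        """True only if the photo's hash is near one already in the database.
        A photo of something not in the database never matches, no matter
        what it depicts."""
        return any(hamming_distance(photo_hash, known) <= max_distance
                   for known in KNOWN_HASHES)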

I don't care. There is no way on this good earth that law enforcement is going to let them get away with that. They're effectively saying they will scan photos that are obviously child porn and ignore them unless they match the known-hash database. That isn't a long-term stable position - if they think scanning for anything is OK, there is no logical reason to stop here. So they probably aren't going to stop, and they certainly aren't going to announce every step they take to widen the net.

And their 1:1,000,000,000,000 number is still delusional. The system is going to produce false positives. There are more sources of error here than the hash algorithm itself.
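
To see why the headline figure rests on more than the hash, here's a back-of-the-envelope sketch with made-up numbers (the per-image error rate, library size, and flag threshold below are assumptions for illustration, not Apple's actual parameters):

    from math import comb

    p = 1e-6          # assumed per-image false-match probability (illustrative)
    n = 1_000         # assumed photos in one account's library
    threshold = 30    # assumed number of matches before an account is flagged

    # Probability an account with only innocent photos reaches the flag
    # threshold, treating matches as independent (a modelling assumption
    # that may not hold in practice).
    p_flag = sum(comb(n, k) * p**k * (1 - p)**(n - k)
                 for k in range(threshold, n + 1))
    print(f"Chance an innocent account is flagged: {p_flag:.3e}")

    # The headline per-account number is only as good as p, the independence
    # assumption, and everything downstream of the hash: the hash database
    # itself, adversarial collisions, human review, and so on.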



