While I applaud the goal, this is unlikely to achieve it on its own. Those who should be affected are probably not using Messenger to begin with, and if they are, they can easily switch to other communication apps.
It is also not quite clear whether Apple is taking a moral or a legal stand here. If it is a legal one, then this could in the future open the door to:
- Scanning for other types of illegal content
- Scanning for copyrighted content (music, images, books, ...)
- Scanning your iCloud files and documents
- Scanning emails to make sure you are not doing anything illegal
If it is morally driven and Apple really wants to take a stand against any CSAM on its devices, they would have to do so at the system level, monitoring all data being transferred (all communications, all browsing, etc.), so this could be just the first step.
A morally based agenda would be much easier for the broader public to accept, while a legally based agenda could lead to other kinds of privacy-intruding consequences. Even a morally based agenda would still set a precedent: ultimately, we do not know what Apple's "moral values" are, or what it would be willing to intrude on users' privacy over in the future.
It seems like a slippery slope for a company to step onto, any way you turn it, especially if privacy is one of your main selling points.
Another thought: if we as a society agree that CSAM is unacceptable, why not globally prevent it at the internet-router level? edit: jetlagged... we can't, because the data is encrypted. It has to be done at the client level, pre-encryption.
It's not to prevent child abuse, since passively looking at images is not, per se, abuse.
It's also not to limit the making of child pornography, since this only searches for known, already existing images that are in government databases.
If you make new images that are not yet in said databases, you're fine.
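To make that concrete: as I understand it, the scan just compares a fingerprint of each photo against a fixed list of fingerprints of already-catalogued images. A rough sketch (Python, with an exact SHA-256 hash standing in for Apple's perceptual NeuralHash; the database contents and names here are made up for illustration):

```python
import hashlib

# Hypothetical database of fingerprints of known, already-catalogued images.
# The real system uses a perceptual hash (NeuralHash), so near-duplicates also
# match; an exact cryptographic hash keeps this sketch simple.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True only if the image matches an entry in the known-image database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES

# A freshly created image hashes to something not in the database, so it never
# matches -- which is exactly the limitation described above: only previously
# catalogued material can be detected.
```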
I'm not sure what the actual goal is (project a virtuous company image, maybe?), but the result could very well be the opposite of what people think.
Apple is a legal entity not a moral one. It may have moral employees, but at best Apple itself is amoral. It will do what the law allows (or compels) it to do.
This feature absolutely will be used for human-rights abuses by countries like China, just as they have asked Apple to abuse its platform in the past. Why? Because those abuses are legal there, and capitulation will be the only way those governments will allow Apple to continue selling in their lucrative markets.