These scenarios sound rather like "the wrong side of the airlock" stories[1]. Why would China go through an elaborate scheme with fake child-porn hashes when it can already arrest these people on made-up charges, and simply tell Apple to provide the private key for their phones, so that it can read and insert whatever real/fake evidence it wants?
Because they don't know who to arrest yet. The idea isn't to fabricate a charge, it's to locate people sharing politically sensitive images that the government hasn't already identified.
> Because they don't know who to arrest yet. The idea isn't to fabricate a charge, it's to locate people sharing politically sensitive images that the government hasn't already identified.
And maybe even identify avenues for sharing that they haven't already identified and monitored/controlled (e.g. some encrypted chat app they haven't blocked yet).
China does not really need Apple to do much. They already make installation of some apps mandatory by law. Also, some communication must be done with WeChat and so on. They have a pretty good grip already.
> They already make installation of some apps mandatory by law. Also, some communication must be done with WeChat and so on.
Can you give some examples of this on the iPhone?
Also, it seems like on the spying front [1] (at least publicly, with Apple), they've preferred more "backdoor" ways to get access over more overt ones, so this scanning feature might encourage asks that they wouldn't have otherwise made.
[1] this contrasts with the censorship front, where they've been very overt
Agreed. If China can force Apple to do almost anything by threatening to ban iPhone sales, why bother with fake CSAM hashes? That just adds an extra step. It's not like the Chinese government needs to take pains to trick anyone about their attitude toward "subversive" material.
In both cases people are stretching to come up with hypothetical scenarios about how these systems could be abused by a government ("they could force Apple to insert non-CSAM hashes into their database" or "they could force Google to insert a backdoor into your app") while completely ignoring the elephant in the room: if a government wanted to do these things, they already have the power to do so.
If your concern is that a government might force Apple or Google to do X or pull product sales in their country, whether Apple performs on-device CSAM scanning vs scanning it on their servers, or whether Google signs your app vs you signing it doesn't materially change anything about that concern.
The outrage around this particular situation is even more confusing to me because you can opt out entirely by disabling iCloud Photos, and if you were already using iCloud Photos then the scanning was already happening on Apple's servers anyway, so the only actual change is that the scan now occurs before instead of after the upload.
Exactly. Apple can already ship literally any conceivable software to iPhones. Do people really think their plan was to sneak functionality into this update and then update the CSAM database later, and they would have gotten away with it if it weren't for the brilliant privacy advocates pointing out that this CSAM database could be changed over time? That's pretty ludicrous. If the Chinese government wanted to (and thought it had sufficient leverage over Apple), they could literally just tell Apple to issue a software update that streams all desired private data to Chinese government servers.
Not quite. Those are still ostensibly servers located in China but not directly controlled by the government (edit: apparently the hosting company is owned by Guizhou provincial government). But yes, this is precisely my point. Any slippery slope argument about Apple software on iPhones is equivalent to any conceivable slippery slope argument about Apple software on iPhones. If you're making one of these arguments, you're actually just arguing against Apple having the ability to issue software updates to iPhones (and by all means, make that argument!).
China's laws are such that there's no need for them to obtain a warrant for data housed on servers of Chinese companies. Not only do they not need a warrant but companies are required to facilitate their access. While the servers aren't controlled by the Chinese government, government law enforcement and intelligence agencies have essentially free access to that data.
> ostensibly servers located in China but not directly controlled by the government
"ostensibly" is the key word there. If the datacenter is physically located in China, then there's a CCP official on the board of the company that controls it.
So your argument boils down to this: since Apple can already install software without us knowing, we shouldn't worry about a new client-side system that makes it substantially easier for nation states to abuse? I don't find that argument the least bit compelling.
I’m not saying that we shouldn’t be concerned with Apple actually launching things that are bad. I’m saying we shouldn’t make arguments of the form “this isn’t bad yet, but they could change this later to make it bad.” Because obviously they can change anything later to be bad. If the system as currently described is a violation of privacy, or can be abused by governments, etc. then just make that argument.
Because Apple has already built that functionality, and it exists? What alternative dragnet currently exists to identify iOS users who possess certain images? This would be code reuse.
The point of China or any government adding collisions would be to use Apple's system as a dragnet to find users possessing the offending images.
The way it would work is the government in question would submit legitimate CSAM, modified to produce a collision with a government target image. Looking at the raw image (or a derivative), a reviewer at Apple or ICMEC would see a CSAM image. The algorithm would see the anti-government image. So Apple scans Chinese (or whoever) citizens' libraries, finds "CSAM", and reports them to ICMEC, which then reports them to the government in question.
Every repressive government and some notionally liberal governments will eventually do this. It is likely already happening with existing PhotoDNA systems. The difference is that those are used by explicit sharing services, whereas Apple's new system will search every photo in a user's library regardless of whether it was explicitly "shared".
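The collision mechanism described above can be illustrated with a toy "average hash" sketch. To be clear, this is nothing like Apple's actual NeuralHash; all images and values here are made up. It only demonstrates the general property being exploited: because perceptual hashes deliberately map similar images to the same bits, two entirely different images can be crafted to share a hash, so a blocklist match does not prove the flagged image is the one the reviewer vetted.

```python
# Toy perceptual-hash collision demo (NOT Apple's NeuralHash; purely illustrative).

def toy_phash(pixels):
    """Tiny 'average hash': one bit per pixel, set if the pixel exceeds the image mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

# Two different "images" (flattened grayscale pixels) that share a hash.
reviewed_image = [10, 200, 10, 200]   # what a human reviewer would vet
target_image   = [50, 250, 40, 240]   # a different image the algorithm also flags

blocklist = {toy_phash(reviewed_image)}

def is_flagged(pixels):
    """Simulate the on-device check: hash the photo, compare against the blocklist."""
    return toy_phash(pixels) in blocklist

# Different pixel data, identical hash bits -> the scanner cannot tell them apart.
print(is_flagged(target_image))  # True, despite target_image != reviewed_image
```

The real attack is harder (crafting a collision against a 96-bit neural hash rather than a 4-bit toy), but researchers did demonstrate NeuralHash collisions shortly after the system was announced, so the concern is not purely hypothetical.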
> So Apple scans Chinese (or whoever) citizens' libraries, finds "CSAM", and reports them to ICMEC, which then reports them to the government in question.
If Apple finds that a particular hash is notorious for false positives, they can reject it / ask for a better one. And they’re not scanning your library; it’s a filter on upload to iCloud. The FUD surrounding this is getting ridiculous.
Look, I said it in another post, it is not Apple’s job to act as an arm of law enforcement. The same way it is not either of our jobs to be vigilante sheriffs and police the streets.
We’re talking about a company that makes phones and computers, and sells music and tv shows via the internet. Does that matter at all?
How about this: all car manufacturers must now immediately transmit, wirelessly, whenever the driver of the car is speeding. How about that?
Let’s just go all out and embed law enforcement into all private companies.
This is fascism, the merging of corporations and the government.
Have we established that a US NGO is accepting "CSAM" hashes from China, or that they are cooperating with them at all? That seems unlikely, and Apple hasn't yet announced how it plans to scan phones in China. Besides, wouldn't China just demand outright full scanning capabilities for anything on the phone, since you have no protection at all from that in China?
> Have we established that a US NGO is accepting "CSAM" hashes from China or that they are cooperating with them at all?
I believe Apple's intention is to accept hashes from all governments, not just one US organization. One of their ineffectual concessions to the criticism was to require that two governments provide the same hash before they'd start using it.
China can definitely find some government in need of a cash injection to help push into the database the hash of a certain uninteresting square where nothing happened.
Sure, but Apple receives far less backlash if the system is applied to all phones and under the guise of "save the children". This would allow Apple to accommodate any nation state's image scanning requirements, which guarantees their continued operation in said markets.
The main announcement was that Apple was getting hashes from NCMEC, but they also listed ICMEC and have said "and other groups". Much like the source database for the image hashes, the list of sources is opaque and covered by vague statements.
Maybe, but it probably stretches farther back than that, maybe even to before sliced bread or cool beans. Ten years before The Hitchhiker's Guide there was a computer, HAL, who wouldn't open the airlock for a particular astronaut.
They wouldn't. They would force Apple to add hashes of things the CCP doesn't like, such as Winnie the Pooh memes, and turn Apple's reporting system into yet another tool to locate dissidents. How would Apple know any different? "Here are some hashes; they're for CSAM, trust us." Apple built a framework where they will call the cops on you for matching a hash value. Once governments start adding values to the database, Apple has no reasonable way of knowing what images those actually relate to. Apple themselves said they designed it so you couldn't derive the original image from the hash. They are setting themselves up to be an accessory to the killing of political dissidents.
[1] I'm stealing the expression from this excellent article: https://devblogs.microsoft.com/oldnewthing/20060508-22/?p=31...