
I'm really surprised that they tried it in the first place. I'm sure governments want it (China would love to see your Winnie the Pooh meme stash), but Apple is pretty good at fighting the US government and should have felt 100% free to say "in the absence of a law that compels us to write software, which is unconstitutional btw, we're not doing it". They have done it many times, so it felt really out of character. There must have been some contract / favor they were going after, and the opportunity must have expired. (I'm sure some large department of the federal government has some shiny new Android phones today.)

It would be interesting to figure out the real story.

My favorite part of the whole saga is that leaked letter that said "the screeching voice of the minority" will kill the project. We did indeed, and I'm happy to screech again the next time the government wants a tool they can use to scan my phone without a search warrant.



> in the absence of a law that compels us to write software, which is unconstitutional btw

In their 2016 dispute with the FBI, the gist of Apple's 1A and 5A arguments was:

Writing software is a form of speech within the meaning of the First Amendment. Forcing Apple to create software would therefore be compelled speech, and so the order to do so must be narrowly tailored to serve a compelling state interest (see: "strict scrutiny").

The FBI has a legitimate interest in investigating and prosecuting terrorists, but their request does not pass strict scrutiny:

1. The government was only vaguely speculating that there might be something useful on the iPhone, but what was requested would have far reaching adverse consequences beyond just that single device.

2. Apple publicly values privacy and data security. Forcing Apple to create software (i.e. compel speech) that runs contrary to their values is a form of viewpoint discrimination.

3. Apple is a private party in the matter, far removed from the crime itself, and the request is a lot of work. So conscripting Apple to assist the government would constitute an undue burden and therefore violate Apple's substantive due process rights.


I would really like the defence to be based on our rights as human beings, rather than 'placing undue burden on a corporation'.

Suppose next time the NSA writes the code for Apple and Apple just has to sign it; will the defence stand up in court?


Due process and speech are among our rights, so that’s literally what this is.

If the NSA had a way to break into a device without violating any party’s rights, they would do so.

Even if NSA wrote the code you might argue the [Apple just has to sign it] step violates Apple’s rights.


However, it's repeatedly been shown that they either 1) violate our rights with impunity or 2) go to rubber-stamp courts that practically never deny a request to invade privacy.


Wasn't that how FISA warrants were enforced?


Presumably yes, for both of the same reasons, or at least the first one.


Frankly, the FBI should have just requested the bootloader signing keys and bootloader documentation and written it themselves. The only protection then is the 5A - and Apple can't take it, because it's them on the stand.


My belief is that the Feds already had the tools they needed to get the information they needed off the phone, but wanted to use the situation to create a new standard of practice as it was too juicy to pass up—never let a horrific event go to waste.

Luckily, it didn’t work in the way they hoped or at least to my knowledge, it didn’t.


> I'm really surprised that they tried it in the first place.

Compare the approach Apple floated in their CSAM white paper to what Google is already doing today.

Google:

Is scanning for images of any naked child (including images of your own children that Android automatically backed up) and reporting parents to the police for a single false positive.

https://www.nytimes.com/2022/08/21/technology/google-surveil...

Apple's proposed system:

>Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes.

The device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.

Using another technology called threshold secret sharing, the system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content.

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images.

https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

Apple designed a system where they don't know anything about the data you upload to their server, and where they don't do anything at all about positive matches, until you cross the threshold (later revealed to be 99 images) that match against the database of known kiddie porn.

Even then they have a human review the images you shared before taking further action to protect against the possibility that there might be 99 false positives on a single account.

vs.

Google's system where taking a picture of your own child at the request of your doctor can result in Google reporting you to the police for a single false positive and deleting your account with no human in the loop.
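
For intuition, the "threshold secret sharing" piece is standard cryptography. Below is a toy Python sketch of Shamir's scheme, the usual primitive behind it. This is not Apple's actual PSI construction; the field size, threshold, and names are illustrative. The point is only that below the threshold, the shares reveal nothing about the key.

    # Toy Shamir secret sharing: illustrates the threshold property only.
    # NOT Apple's actual protocol; parameters and names are made up.
    import random

    PRIME = 2**127 - 1  # a Mersenne prime; arithmetic is done mod PRIME

    def make_shares(secret: int, threshold: int, count: int):
        """Split `secret` into `count` shares; any `threshold` recover it."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        def f(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, count + 1)]

    def recover(shares):
        """Lagrange interpolation at x=0 yields the constant term (the secret)."""
        secret = 0
        for xi, yi in shares:
            num = den = 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    key = random.randrange(PRIME)      # device-side key protecting the vouchers
    shares = make_shares(key, threshold=5, count=100)
    assert recover(shares[:5]) == key  # at the threshold: server can decrypt
    assert recover(shares[:4]) != key  # below it: garbage (w/ overwhelming prob.)

Below the threshold, the shares are mathematically consistent with every possible key, which is the information-theoretic guarantee the white paper is invoking.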


The problem is that Google’s system “works better” from the point of view of law enforcement. By that I mean it’s much less restrictive, and will find “novel” CSAM.

The problem with Apple’s privacy-conscious approach is that it conceded the fundamental principle and agreed that scanning private (unshared) photo repositories was reasonable. Having conceded that point, everything is just a confusing technical argument about effectiveness-versus-restrictions.

Apple would have been under continuous law enforcement pressure to enhance their limited technology so that it was “as effective” as Google’s ML-based scanners. They might have resisted the pressure for a while, but once you’ve conceded the easily-understood principle then all you have to respond with is dull technical arguments that society at large won’t understand. Law enforcement would have (correctly) argued that Apple had agreed that CSAM scanning of private files was justified, so why are they using a creaky old technology that can’t find novel CSAM and is letting bad guys get away with child abuse? Eventually law enforcement would have won that battle and Apple would have been forced to deploy an ML-based scanner, which would have undermined the thoughtful privacy protections deployed in their first version.


> The problem is that Google’s system “works better” from the point of view of law enforcement.

Reporting parents to the police for a single false positive is not better in any way.

Law enforcement does not have unlimited resources that can be wasted every single time Google's algorithm screws up.


> Law enforcement does not have unlimited resources that can be wasted every single time Google's algorithm screws up.

Are you arguing from the LE perspective? Or from the taxpayer perspective? LE is happy to enforce anything that gives them more power (vide civil forfeiture laws). And the signal-to-noise problem is not really a problem if the real signal is political expediency. Selective enforcement is the way.


All you're saying is that Google needs to do a more careful human review before reporting to law enforcement. This is not a strong argument for privacy-preserving tech.


No. I'm saying that Google's system of using machine learning to look for images of naked children and reporting parents to the police when there is a single false positive (instead of only looking for known examples of kiddie porn) is exceptionally problematic.

The fact that Google refuses to put humans in the loop when they know the decisions made by their algorithms on this and other subjects are highly unreliable simply adds insult to injury.

Losing access to your account because Google's algorithm screwed up is bad enough. Being accused of child abuse because you took a picture of your child's first bath is a bridge too far.


Apple's proposed system did scanning on your device, whereas google's system does scanning in the cloud. That is the distinction. A person's own hardware shouldn't be used to report them to the police. You cannot interpret technology like this in terms of what it is applied to today, because its use will be broadened in the future.


> Apple's proposed system did scanning on your device, whereas google's system does scanning in the cloud. That is the distinction.

No, the distinction is that Apple's proposed system only scanned photos you uploaded to iCloud Photos, and did so in a way that not even Apple had access to the results until you crossed the "99 images that match known kiddie porn" threshold. Even then, they had an actual human being review the situation to make sure there weren't 99 false positives on a single account before calling the police.

There was absolutely no possibility of calling the police for a single false positive, which is what Google is already doing today.


False positives are inherent to the justice system. While unfortunate, they do not cross any lines or send us down any slippery slopes. They are priced in. Thus, "what about Google" is mere what-about-ism. People are occasionally falsely accused, charged, and even convicted of crimes. That is not what is under discussion here.


> False positives are inherent to the justice system

This is in no way an excuse for Google calling the police to report child abuse because their algorithm made a bad call, and Google doesn't want to hire human beings who can intervene when their algorithms very frequently screw up.


Google is also bad. We are agreed on that.


I am against CSAM as much as anyone, but NCMEC and their database are a total farce. They produce mountains of unactionable reports and false positives, and their (or was it Microsoft’s) weird hashing thing was itself visually identifiable as CSAM and reversible. That letter was just a whole mask-off moment for them.

I wouldn’t be surprised if NCMEC had a hand in starting Pizzagate tbqh. I have no proof and I don’t think that they did. But if it broke the news tomorrow I would be like, “Nah that’s totally believable.”


> They produce mountains of unactionable reports, false positives, and their (or was it Microsoft’s) weird hashing thing was visually identifiable as CSAM itself and reversible.

Are there examples or reports of this? I hadn’t heard this before. (It’s not hard to believe though; e.g., TSA airport security.)


This paper [1] cites some stats from the UK over one year (I think 2019) - from about 29.4m reports to NCMEC, 102,842 were referred to the NCA, of which 20,038 were referred to policing agencies, which led to 6,500 suspects, and about 750 of those suspects were prosecuted, which it estimates makes up about 3% of all prosecutions for indecent image offences. That comes out to about 0.7% of NCMEC reports to the UK's NCA leading to prosecution.

I read another article that linked to stats from Irish and Swiss federal police, and both of them reported discarding about 80% of NCMEC's reports to them in the first stage as not being at all criminally relevant, but I can't find the link right now.

1. https://www.cl.cam.ac.uk/~rja14/Papers/chatcontrol.pdf


> That comes out to about 0.7% of NCMEC reports to the UK's NCA leading to prosecution

That doesn’t imply the rest was unactionable. Certainly, the reductions from 20,038 to 6,500 and from there to 750 aren’t due to this tech. The first could be because it produced multiple hits for one person, the second because of policies of the police (minors might have been warned rather than prosecuted, untraceable phone numbers ignored, and persons known to have died removed, for example)

NCMEC also would (rightfully, AFAIK) argue that they never claimed the code to be perfect, and that the earlier reductions are by design.

The real problems, IMO, are (from that pdf):

“the data do not support claims of large-scale growing harm that is initiated online and that is preventable by image scanning.”

and

“The first wave of prosecutions for illegal abuse images, Operation Ore, swept up many innocent men who were simply victims of credit card fraud, their cards having been used to pay for illegal material. A number were wrongly convicted, and at least one innocent man killed himself”


Wow, that's a Kafkaesque nightmare.


Back when the news first came out, there were several blog posts posted to HN by people who host user-uploaded photos at scale and who had NCMEC come knocking at their door to strong-arm them into participating in their programs.

They are not even a proper government agency, yet they walk around with the balls to pretend they are equivalent to the NSA.


It’s even worse than you can imagine. Even when you find an image… you can’t delete it right away, so as to not tip your hand that it was automated.


> I wouldn’t be surprised if NCMEC had a hand in starting Pizzagate tbqh. I have no proof and I don’t think that they did. But if it broke the news tomorrow I would be like, “Nah that’s totally believable.”

If I’m not mistaken it was this kind of speculation that actually precipitated a bunch of people thinking a random pizza joint was involved in sex trafficking and ultimately led to someone opening fire with actual guns.

Maybe we could do with less unfounded speculation.


I can see how it’d be read like that. Please allow me to be clear, they did not do that. I do not think they did that. There is absolutely ZERO proof they did that.

However if they were to do something outlandish and insane like that, it would not be surprising. NCMEC is drunk on their own power and status. They are on a holy crusade and even constitutional rights mean absolutely nothing to them in their quest. The people at the top are crusaders on a holy war and there are no means that they would not use to these ends.

There are many people in the trenches who are fighting a good fight. They deserve our respect and empathy. However the people at the top…see above. NCMEC needs to be reined in, but of course the second anyone in Congress even looks at them sideways you get a screeching voice of the minority going on about the children and then you’re in the news as a child predator.

In a way, they are responsible for shaping the discourse such that Democrats got knocked as being “for” CSAM, which is what led to Pizzagate taking hold. But I don’t believe for a second that they intentionally, directly created it. But it’s not a big stretch from where they are now.


It was not “this kind of speculation” and the comparison is weak at best


Ha! If you think that apple does not work with governments all around the world and that you can take their word at face value you're going to have a bad time.


I'm sure Apple will give me up to China in a heartbeat, but I actually don't think they'll give me up to the US. It's all about incentives; China can say "sorry, you can't buy our cheap cheap capacitors anymore" and they go out of business. The US can write them a strongly-worded letter about how they don't love freedom, and it's their constitutional right to throw that letter directly into the trash while donating to the campaign of whoever is running against the letter-writer. Pesky constitution!!

I always worry about how many FBI / NSA employees work at Apple/Google/etc. though. If you're doing a lot of illegal stuff, I'd avoid uploading the evidence to the Internet. Even someone with the best intentions writes software with bugs or hires compromised employees.


Apple is a key member of PRISM [1] and got on board just about the time their phone business started to skyrocket. Just because the media stopped talking about PRISM doesn't mean it went away. If anything, I'd expect it has only grown more emboldened given the relatively tepid public response to it. I'd argue this is likely the real reason that Huawei was banned. The US 3-letter agencies had the choice of bringing a Chinese company into the surveillance-state web, having a phone on the market they have no control over, or simply banning those phones.

As for incentive, big tech regularly flaunts nearly every single behavior that our anti-competitive clauses were meant to prohibit. 'Back in the day' Microsoft was targeted with, and lost, an antitrust lawsuit for bundling Internet Explorer with Windows. Nowadays, bundling everything in your own packages, locking other companies into an inescapable market where you impose a 30% tribute, buying out competitors to prevent competition, and more - that's all perfectly cool, somehow.

If Apple, Microsoft, Google, et al stopped playing ball - they'd be out of the game before nightfall.

[1] - https://en.wikipedia.org/wiki/PRISM


The entire organization that investigates leaks and supply chain tampering at one of those is made up of formers and retirees of U.S. and allied intelligence and law enforcement communities. It is a large organization. It is itself a very small part of the organization containing it.

This is not uncommon, of course. Cops work at Kohl’s and shadow the Zuckerbergs and bounce at clubs and set security policy at multinationals. I don’t get why this is weird unless you’re implying that working in LE or intelligence automatically makes someone untrustworthy. Which is odd, since half of the peace officer or clearance processes are establishing trust and creating an enormous hill to climb to successfully violate it. The threat model is obvious, and it only takes a moment of thought to realize that “ex-CIA person” you’re looking at with a cocked eyebrow was deemed by said organizations to be trustworthy enough to represent the interests of the United States or wherever they come from. Do they get it wrong? Sure. Is it as often as you think? No. You don’t hear the successes and the ratio is way lower than you think it is.

There are hundreds of thousands of people who would disagree with your premise. Many of them have a quiet, nonzero involvement in ensuring you can safely share that opinion and eat, a majority are former military and had direct involvement in the same, and at the end of that they’d like to put the gun away and provide for their family. Why is that automatically suspect? It’s not like they’re walking out of government with an armful of implants.

Put it this way: would you rather someone that the government spent millions of dollars training in, say, cybersecurity and active threat assessment end their career by buying and operating a movie theater or by making sure the Internet and power grid keep working? I’m about as liberal as it gets and even I can get there while acknowledging that occasionally those powers are used for malevolence. I’d counter that Sand Hill is just as capable of capitalizing that malevolence as Fort Meade and arguably more successful on some axes (no pesky laws). And sure, leak investigation is a bit stupid and mildly malevolent, but supply chain isn’t, and it’s also their prerogative to run their house as they wish.

We’d all benefit from debating the policy instead of the person a bit more, I think.


Oh, I'm not making a value judgement. I think that these government agencies should absolutely try to get people into Apple adding secret undetectable backdoors that help them catch criminals. I also think Apple is completely within their rights to make them work for it.

Basically, I totally agree with pretty much everyone that people who abuse children and upload pictures of it to iCloud should burn in hell. But I don't think that Apple should be compelled to add hell to iOS 17 if that makes any sense. If the NSA wants to hire double agents that pass the Apple interview process and add sneaky undetectable code that reports those people to them, I think that's great.

I think I just like the chaos of it all. That's why nobody votes for me when I run for Congress or whatever.


You are making a value judgment, though, even in your followup. You’ve implied that employment of one of those people is prima facie surreptitious, despite a carefully-drawn picture of how nearly all of the cases you’re gesturing toward are benign and not worth your ongoing concern. Foreign governments doing exactly what you’re alleging, on the other hand…

To be honest, that opinion makes you unelectable because you’ve alienated a huge group of people (way, way bigger than you think, and across the political spectrum), partially by imagining the beltway and back rooms of FAANGs as a le Carré novel. Reality is boring. Do horror shows happen? Duh. But take PRISM, for example. PRISM is an efficient, automated legal warrant process to streamline subpoena and provenance of user data for national security purposes, nothing more, but everybody screamed oh my holy hell! because the Snowden slides didn’t contextualize that and could be taken to imply something far worse. It’s toil reduction. Ask anybody in compliance at major companies. Subpoenas are a huge bitch at scale and nearly all major companies have a PRISM equivalent facing the other direction precisely for the reason PRISM exists. It’s not cigar smoke and port mirroring but for some people that’s more fun to imagine, I guess.

Seriously, reality in all of it is far less interesting than you think, and that’s one example of many. And often the fetishization of the secrecy and imaginative scenarios takes away from the real issue, which is market forces incentivizing the erosion of privacy and civil liberties. So ironically, by worrying you’re weakening the worry.

I know this because I do.


> PRISM is an efficient, automated legal warrant process to streamline subpoena and provenance of user data for national security purposes

Dropping the euphemisms, it's a way for cops to do their jobs more easily. That's cool when they're doing it transparently, with sufficient oversight and protections against abuse, and with a high success rate. The extent to which any of those three is true is pretty dubious.

Even if you truly do believe in this, which I totally understand–I have plenty of friends who work in this area, they are really behind the whole "we protect America" thing and I'm guessing you're probably in a similar camp–the issue is even more fundamental: it's one of attitude. Just because you're doing something good doesn't mean you have the right to have your job made easier. That's just not how it works. You don't get to take nuclear secrets and work on them from your personal laptop on the beach, no matter how pure your intentions are. When you work on things that are "dangerous" you don't get to just do whatever you want. The PRISM leaks were a scandal because they made the American people feel like the government was not accountable to them, and that is the price you have to pay to do anything in this country. End of story.


Look you make lots of great points.

But the real question is “why do so many people have these suspicious attitudes towards law enforcement and CIA and so on”.

Might it be because of the Hoovers and the Nixons and the countless CIA overthrows and the fake WMDs and the support for Israel and the military industrial complex and 2008 and ..

The government has most definitely not behaved in ways that earn our trust. And yeah, that will lead to scenarios where, when we (the people) see violations of constitutional rights, we don’t rush to nuanced investigations; we rush to assuming it’s another instance of a corrupt government.


This type of shit is why I am always refusing to get a clearance. I wouldn’t be able to strut around proudly about how I’m destroying the country like this guy.


Not just this country — but countries all around the world.


I think you two might be talking about different groups of people? I read your comment as "current employees of three letter agencies act as spies by embedding themselves into tech companies" while the comment you're replying to seems to be talking about "people who used to work for these places should be able to get jobs as civilians".


To be fair, I recall seeing an article here a few days ago stating that Apple was moving a significant chunk of their manufacturing out of China. I don't think Apple is that dependent on China. I'm sure Apple has lines they will not cross. Maybe giving you up to China isn't one of them, but I'm sure they have them.


Absent significant changes in the world order, no major company will ever be successful in severing relations with China. No nation today is positioned to both bring up the manufacturing AND compel their people to work. Without the two, no nation can match China's production and thus produce the alternative “cheap capacitors”.


Where do you think Apple is headquartered? There’s a lot the US government can do to make them heel.


Apple publishes the volume of government information requests across the world: https://www.apple.com/legal/transparency/choose-country-regi...

They try to reduce their exposure to these government information requests by keeping personal data on the device and storing anonymized records on their servers when possible.


I remember hearing about these types of requests that also have a gag order placed on them so companies can't explicitly inform the public about them. I haven't heard much about that sort of thing lately though. Either very good or very bad.


Keep in mind that warrants and gag orders are very much behind the state of the art. Apple might receive a request for all the information they have on you, and they are absolutely compelled by law to hand that over. But unfortunately, the information is all encrypted with a key that only you know. So the government has no choice but to come to your house and hit you with a rubber hose until you tell them the key, but bad news! It turns out it's illegal to do that. They can't hit you with a rubber hose until a pesky jury has heard the evidence against you and returned a guilty verdict, and if they had, they wouldn't need to be probing your iCloud account.

What is more of a grey area is whether or not a court can compel Apple to hit you with some malware. The government can ask "give jrockway a special firmware so that whenever he tries to shitpost on HN, the browser locks up until he types his iCloud encryption key into nsa.gov". What's not clear is whether or not Apple has to obey that request. It seems pretty likely that the government doesn't have that power, so Apple can tell them to go away. But who knows, it's perfectly possible to pass a constitutional amendment that literally says "Apple, Inc. has to dedicate 100% of their resources to spying on readers of the dark website Hacker News", and it's all moot. I've seen Congress (or rather, the states in this case) trying to agree on less controversial things before, though, and I'm not too worried about anything changing. (What's the opposite of pro? Con. What's the opposite of progress? Congress.)

Encryption is literal kryptonite for democratic governments. Apple is putting themselves at the forefront of the debate. It will be interesting to watch!


There's no such thing as literal kryptonite.


> "in the absence of a law that compels us to write software, which is unconstitutional btw"

This isn't really settled. Writing software is not always constitutionally protected speech, and Apple being compelled to write software would probably not constitute a violation of the First Amendment. Federal wiretap law can compel companies to make it easy for the government to get data via a warrant (which necessarily entails writing code to produce that data) and has been upheld in the past. Also companies are often liable for the code they write. Both of those are examples of when code is not considered speech.


The government compelling apple to cause peoples phones to search themselves (with no probable cause that the suspects of the searches committed a crime) would be facially unconstitutional under the fourth amendment, not the first.

Compelled speech would be an interesting argument against compelled writing of software, but is definitely the weaker one here.

Edit: Oh, I see in one of the other replies GP raised the first amendment. Just take this as a reply to the idea in general...


The government need not care how Apple complies with the law, which could merely state that cloud storage providers are liable for illegal material stored there by their customers, regardless of the cloud provider's knowledge. This would be catastrophic to cloud storage in general, of course, but given that strict liability is a thing, I don't see how such a law could be ruled unconstitutional.


Trying to work around the 4th like that might manage to make the law facially constitutional, but I'd be surprised if it made the searches conducted as a result of it valid.

By my understanding, you have to avoid a "warrantless search by a government agent" to avoid violating the constitution. The "warrantless search" part is really beyond dispute, so it's the "government agent" part that is in question. In general, "government agent" is a term of art that means "acting at the behest of the government", but I don't know exactly where the boundary lies. I'd be fairly surprised if a law that excused accidentally storing CSAM after a failed search, but not accidentally storing it without any search, didn't make the party doing the search a government agent. And if you make the former illegal too, cloud storage at all (scanning or not) is an impossible business to be in.


You already have the FBI partnering with computer repair shops to do dragnet searches of customer's hard drives for CSAM when they bring their computers in for repairs.


I'd argue the Fifth Amendment should apply to mobile phone handsets, but law enforcement would pitch a fit to lose those.


Section I of the Thirteenth Amendment reads: “Neither slavery nor involuntary servitude, except as a punishment for crime whereof the party shall have been duly convicted, shall exist within the United States, or any place subject to their jurisdiction.”


You heard it here first: all regulations are slavery!


There is a difference between regulations and compelled action. The government can make your ability to do X conditional on you also doing Y, but it generally can't just make you do Y.

The exceptions are actually quite few outside of conscription, eminent domain sales, and wartime powers.


That sounds right to me. The government has tremendous powers. Forcing people to write computer programs isn't one of them. (They could have saved a lot of money on healthcare.gov if they had that power!)


How does conscription fit into that picture?


It fits in if 5 out of 9 supreme court justices want there to be a draft.


> This isn't really settled.

I agree with that. Especially in recent years, it's nearly impossible to tell what is and isn't settled case law.

It would be an expensive battle for the taxpayers to compel Apple to write software, I think. They've tried and failed. It will just be more expensive next time.


> Federal wiretap law can compel companies to make it easy for the government to get data via a warrant

Not the same thing. The wiretap law cannot compel any company; only licensed telcos. These companies, in exchange for the license to provide such service to the general public, get some advantages but have to agree to a set of limitations.

I have my doubts that Apple would be a holder of such a license that would make it subject to such regulations.


> There must have been some contract / favor they were going after, and the opportunity must have expired.

Gratis dragnet scanning in return for the DOJ looking the other way on their monopolistic behavior. There's been an administration change and the new guy may be ready to work on behalf of the people.


> *"in the absence of a law that compels us to write software, which is unconstitutional btw"

I'm not sure that's true. Consider telcos, for example. They're required to write software to make it easy for law enforcement to tap into phone calls, given an appropriate warrant. Telcos aren't just allowed to throw up their hands and say, "sorry, we don't have the capability to let you do that".

And regardless, the law need not compel them specifically to write software, it need only make them liable for any illegal material stored in iCloud, with or without their knowledge. So if someone is caught with CSAM on their iPhone, for example, and it's discovered that person had iCloud backups enabled (or whatever it is), then the government would have pretty clear pretense for a warrant to search that user's iCloud files.

Granted, Apple could then implement end-to-end encryption so even they would not be able to access the files, but that might not even be a good enough defense, if it can be proved that the phone itself had uploaded CSAM to iCloud.


> They're required to write software to make it easy for law enforcement to tap into phone calls, given an appropriate warrant.

Not exactly right. They are required to make it technically possible to tap a phone line, which can be as simple as giving access to cables. It's not necessary to give investigators a tap-from-home capability and free foot massages.

> it need only make them liable for any illegal material stored in iCloud, with or without their knowledge.

I can see a mall lawyer ripping up that requirement three ways - Apple has an army of way better educated legal representation.


There is also the issue that once the government had access, the telco could not create a fake software layer and just use this excuse. And if they created a real and required software layer, then there would be an expectation of similar access as before.

This is great reason to never grant it in the first place. It forever locks access and security at that point in time.


Just to restate the solution... it didn't scan your phone. For photos that were going to be synced to iCloud it would use the handset to create a hash and send that to Apple. It was the most private solution out there for online photos.
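
The "hash" in question is a perceptual hash, designed so that near-duplicate images produce nearby hashes. NeuralHash itself is a neural-network-based hash; here is a much simpler "average hash" sketch in Python, just to show the shape of the idea. The client-side flow in the trailing comment is hypothetical, not Apple's exact pipeline, and `known_bad_hashes` is an assumed name. Requires Pillow.

    # Toy perceptual hash ("average hash"), standing in for NeuralHash.
    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        """Shrink to size x size grayscale; each bit = pixel brighter than the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > mean)
        return bits  # 64-bit hash; similar images differ in only a few bits

    def hamming(a: int, b: int) -> int:
        """Number of bits that differ between two hashes."""
        return bin(a ^ b).count("1")

    # Hypothetical on-device flow: hash locally, match against known-bad hashes,
    # and only an encrypted voucher encoding the match result leaves the device.
    # matched = any(hamming(average_hash("photo.jpg"), h) < 10
    #               for h in known_bad_hashes)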

Google just outright scans your photos, as does Microsoft. I guess Apple probably does too, or will now.

So your "win" here was "Apple can't use your handset to help ensure the privacy of your photos while also trying to meet local laws."

Meanwhile you're fine to register your face on that exact same handset, let it sync to your iPad, and have it store and share your voice print for Siri. People are happy for Apple to scan their heartbeat and send out alerts if they fall over.

But don't take a hash of a photo you're about to upload to Apple's servers! That is too far.

Ridiculous.


> I'm really surprised that they tried it in the first place

Might be because they saw https://ec.europa.eu/commission/presscorner/detail/en/ip_22_... coming. If that becomes law it will introduce “an obligation for providers to detect, report, block and remove child sexual abuse material from their services”

Might also be because their lawyers said banning apps containing even mild sexual content, while not even trying to do something like this, wouldn’t hold up in court.

Being as large as they are, they can even afford to do this with the goal of seeing public outrage, ‘forcing’ them to not go forward with the idea.


I am not an Apple user (and won't be, because they cost an arm and a leg where I live, so why bother), but I WAS and AM an active opponent and part of the said "minority".

The problem is not CSAM scanning for finding pedos. The problem is that this opens doors for, as you said, China, because "constitutional guarantees" and other legal protections only exist on paper when it comes to fighting and oppressing dissenters. Sure, there are rules in place, but the best outcome is that NO one has this capability in the first place. If you open the door a crack, you have to open it fully.

I don't even care about the "search warrant", because tell me: do you think in China there is some sanctity in a "police search warrant" issued by a judge who will consider the rights of end users, with the government breathing down their necks for "results"? They have institutionalized and criminalized dissent, as with India for example, and they see dissenters the same as pedos, maybe worse, so this tech should not exist in the first place.

Court orders don't mean shit when the entire system is adversarial against you. If there were no means, then nothing would happen; but if there is a way, they will use it.

I am a dissenter, so I understand how not just Apple but other manufacturers would be forced to allow this, because "well, if Apple can do it, you can too; if you don't, you won't be given a license to sell here", and they might not be as honest about their actions, so yeah.


Do you have a source for the "not constitutional" bit? I referenced it in a post below and was looking for a source.


I'm not a lawyer, but my current theory is that code is speech, and the government cannot compel you to say something you don't want to say.

There is some case law on the saying something you don't want to say part: https://www.mtsu.edu/first-amendment/article/933/compelled-s...

And there is some case law on the whole "code is speech" thing: https://en.wikipedia.org/wiki/Bernstein_v._United_States

(Apple has cited Bernstein v. US as the reason they didn't have to write an app to unlock some mass shooter's phone for a fishing expedition, so that's why I think they agree with my didn't-go-to-law-school opinion there.)

(BTW, as an aside: DJB represented himself in that case. Just some random number theory lover / C programmer who increased freedom for everyone in the country in his spare time. Super cool guy.)


From the wikipedia article you linked, the 1999 case that is cited as precedent and is the one that matters was not self-represented.

> Bernstein was represented by the Electronic Frontier Foundation, who hired outside lawyer Cindy Cohn and also obtained pro bono publico assistance from Lee Tien of Berkeley; M. Edward Ross of the San Francisco law firm of Steefel, Levitt & Weiss; James Wheaton and Elizabeth Pritzker of the First Amendment Project in Oakland; and Robert Corn-Revere, Julia Kogan, and Jeremy Miller of the Washington, DC, law firm of Hogan & Hartson

As for the original question, the framework for this kind of legislation is usually “we ban the hosting of CSAM, you either implement something that eliminates it or you risk being fined for breaking the law”. That may not sound different to you but it is an extremely clear distinction in first amendment terms from “the state department may deny you an export license to publish your code”. Bernstein v US was saying that the burdens to publishing were too high and so he was unable to speak. The burdens did include submitting code and ideas to the government. With CSAM scanning, you are not forced to publish your code (speak), just to do something that satisfies the ban on hosting the content. There are thousands of completely constitutional laws that require you to do stuff a certain way that may involve writing code. This would be one of those.

The San Bernardino thing is a bit more like Bernstein — the government wanted Apple to give them a software tool to unlock a phone. A bit like “give us your code and ideas” but still not quite “give us your code and ideas or we silence you”.


The irritating part of encryption is that it's hard to determine what the underlying data is. With one key, it could be a Word document that says "paying my taxes is the one joy i have in life", and with another, it could be the most horrifying image you can imagine.
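
(That is exactly true for a one-time pad: for any ciphertext, there exists a key that "decrypts" it to any plaintext of the same length. A toy Python illustration, purely for intuition; modern authenticated ciphers complicate the picture, but the basic point that ciphertext alone proves nothing still stands:)

    # One-time pad deniability: a given ciphertext can be "decrypted" to ANY
    # same-length plaintext by exhibiting the right key.
    import os

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    plaintext = b"paying my taxes is the one joy i have in life"
    key1 = os.urandom(len(plaintext))
    ciphertext = xor(plaintext, key1)

    # Construct a second key mapping the SAME ciphertext to a different message:
    alternative = b"grocery list: eggs, milk, bread, coffee, salt"
    key2 = xor(ciphertext, alternative)

    assert xor(ciphertext, key1) == plaintext
    assert xor(ciphertext, key2) == alternative  # same bytes, two "truths"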

I understand the government's interests in this particular issue, people that abuse children are the biggest pieces of shit that I can imagine, but unfortunately math is a pain in the ass and their laws are not possible to administer.

If you think someone is abusing children, you can always send a cop to their house and have them check. That is well within the rights of the government, and I'd even go so far as to say I support that right.


Not difficult, just limited in power. Any time you find decrypted material on a device, figure out where it was stored, and send that company a penalty notice for failing to report it.


As far as I know, that's not a federal law. I can walk past someone violating every human right and constitutional amendment in broad daylight and it's my right to tell nobody about it ever. Apple has the same right, it seems. Amend the Constitution if you don't like it.


Uh, I wasn't saying it was. I was sketching out how you would implement such a law without difficulty. The "you" is the FBI. This is all a hypothetical, hence my use of the subjunctive throughout.


I don't believe that's a constitutional right at all.

Consider "mandatory reporters". People like schoolteachers are legally required to report certain things like parental abuse or neglect of their students. If the government wanted to write a law that said you are required to report a crime if you witness one, I'm not sure that would be unconstitutional.


I don't think they are legally required to do anything. Them having a license to work in their field is dependent on it, though.

As an aside, if you are legally required to report something you have seen, and do not, then you are violating the law, which means legally requiring you to report yourself is unconstitutional under the 5th amendment, so we enter into paradox territory...


Yeah, I think that if someone lost their medical license for, say, endorsing a particular candidate up for election, that would be an EZ 9-0 supreme court victory for them.

As they said on slashdot 100 years ago, child porn is the root password to the constitution. Configure sshd to only accept keys!
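
(Taking the sshd quip literally: key-only login is a couple of directives in /etc/ssh/sshd_config. These are real OpenSSH options; the exact names vary slightly by version:)

    # /etc/ssh/sshd_config: refuse passwords, accept only public keys
    PasswordAuthentication no
    KbdInteractiveAuthentication no   # ChallengeResponseAuthentication on older OpenSSH
    PubkeyAuthentication yes
    PermitRootLogin prohibit-password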


I can’t speak for the USA, but in the UK, professionals are required to act in certain cases where a layperson isn’t.

For example, a layperson can ignore someone having a heart attack but a doctor is required to help (even if it’s just calling for an ambulance).


I must be missing something. The government compels people to speak all the time. In courtrooms every day, witnesses are brought to testify in a manner they can only refuse if it criminally implicates them. A witness simply not wanting to answer a question can and will be held in contempt of court and subject to imprisonment.


From [0]:

...though all compelled speech derives from the negative speech right, that right lends itself to two distinct models representing two distinct approaches to compelled speech: compelled speech production and compelled speech restriction.

A. Compelled Speech Production

Intuitively, the right to free speech necessarily implicates the right to choose what not to say. The characteristic element of this negative speech right model is a compelled movement from silence to speech. A prohibition occurs as a function of the government regulation, but it is a prohibition on silence.

...The original compelled speech cases follow the speech production model of the negative speech right. West Virginia State Board of Education v. Barnette, 319 U.S. 624 (1943), the original compelled speech case, followed this model: school children had no capacity to opt out of reciting the Pledge of Allegiance and saluting the flag. Id. at 626. If they could, they would have remained silent at their desks. Instead, the West Virginia regulation required them to enter public discourse, to engage in speech where they otherwise would not have done so.

...It is the right to be able to say what one wishes to say and nothing else. But since every law implicates autonomy to some degree, the Court has been more lenient unless the infringement on speaker autonomy raises additional concerns under the circumstances. The government does have some capacity to compel production of speech expressing a particular viewpoint given its need to take positions on political issues.

...In other circumstances, though, compelled speech production need not trigger maximal constitutional suspicion if the law does not meaningfully infringe on speaker autonomy.

B. Compelled Speech Restriction

The second model of the negative speech right involves compelled speech that restricts speech. The amount of possible speech supported by any given speech medium is often limited. Forcing someone to speak thereby forces the speaker to occupy a portion of a limited speech medium with expression that she would not otherwise have engaged in. The result is that she no longer has the room to say what she otherwise would have used the limited speech medium to say.

...In Tornillo, the Court applied strict scrutiny and invalidated a Florida right-of-reply statute that required newspapers to publish the response of a public figure about whom the newspapers had previously published criticism. Tornillo, 418 U.S. at 244, 258. In so doing, the Court relied on the notion that the limited nature of the newspaper medium meant that newspapers could publish only so much speech. Id. at 258. By compelling some speech, the law stopped the newspapers from fully expressing what they wanted to say.

[0]: https://harvardlawreview.org/2020/05/two-models-of-the-right...


But they don’t have to compel code; they just have to create a law that says “you must regularly review images for CSAM”.


Or even just, "you are liable for any CSAM your users upload to your platform".


I was personally quite disappointed at the time, but I don’t care anymore. I completely stopped using iCloud at the time, and I’m not going back. Migrating off it was such a pita, and I don’t want to have to do it a second time.


>Apple is pretty good at fighting the US government

To me, thinking that any US-based entity can fight the US government seems a bit absurd.

They can regulate and inspect it into oblivion.


>that leaked letter that said

What letter is this in reference to? Not sure I remember this from 2021. Would love to read it if anyone has a link.



Yeah. This reminds me of how Airbnb rolled out "show total" pricing (inclusive of cleaning fees, tax, etc.) right before Biden mandated such practice into law.


> "screeching … minority"

Could someone put that on a shirt with a nice design and take my money? :D



