The scary thing about this is that many people have images at home that look like child porn: family photos of naked grandchildren playing at the sea or in the mud, as well as sexting between consenting teenagers. There is no way automatic classification can distinguish these from real child porn, because they look almost identical, so many private pictures will end up shared with the government. This not only infringes on the privacy of those people in the most drastic way possible, but might also lead to lawsuits that can tear families apart (think of your father being accused of having CP, only for it to turn out months later that it was an old picture of yourself). The article mentions that 40% of CP investigations in Germany are started against minors (implied: through sexting), which shows that this concern is not purely theoretical at all.
This was not an issue with the old approach of comparing checksums against known CP images, but it increasingly will be as we use more AI algorithms for this.
EDIT: Probably Apple's neural hashes that are currently in the news won't be vulnerable to this, since they were trained specifically to be robust only to changes like color adjustments, cropping, and rotation, not to generalize across visually similar photos. But we can't know for sure, since the white paper isn't that detailed, and "naked kid on beach" photos usually all look very similar.
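To make the distinction concrete, here is a minimal sketch of the two approaches, assuming Python with the Pillow library. The 8x8 difference hash below is a generic perceptual hash used for illustration, not Apple's NeuralHash, and both databases are hypothetical placeholders:

```python
import hashlib
from PIL import Image  # assumption: Pillow is installed

def exact_checksum(path: str) -> str:
    """Cryptographic hash: changes completely if even one byte of the file changes."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def dhash(path: str, size: int = 8) -> int:
    """Generic difference hash: survives recompression, resizing, and small edits."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

KNOWN_CHECKSUMS = {"..."}             # hypothetical database of exact hashes
KNOWN_PHASHES = [0x0123456789ABCDEF]  # hypothetical database of perceptual hashes

def flagged(path: str, threshold: int = 10) -> bool:
    # Exact matching only fires on byte-identical files; perceptual matching
    # fires on anything "close enough" under the Hamming-distance threshold.
    if exact_checksum(path) in KNOWN_CHECKSUMS:
        return True
    return any(hamming(dhash(path), h) <= threshold for h in KNOWN_PHASHES)
```

The threshold is exactly where the two approaches diverge: an exact checksum database essentially never fires on an innocent family photo, while any perceptual or learned hash has to trade recall against false positives on look-alike images.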
"Child porn" is a smokescreen. It's what every privacy advocate saw coming: find the worst, most repugnant thing possible and use it to backdoor encryption.
It's not a dream, and it's really happening, right now.
Yeah. Children are the perfect political weapon. You can use children to justify anything and destroy the reputation of anyone who opposes you. It's reasonable to assume anyone using children as an argument is acting in bad faith.
I seriously doubt governments actually care about children. They probably care a great deal more about maintaining their own power and control over their subjects. We see countries like Russia and China using the tools of surveillance and law enforcement to silence political opposition and dissent.
Yes, given all the burgeoning authoritarian states where phones are sold it's clearly population control.
Regardless of country, the tech that scans for material critical of the PRC, Lukashenko or Erdoğan is the exact same tech that can scan for copyright violations, union organizing, political party opposition, cryptocurrency, or any other dissent your local burgeoning authoritarian wants to pay for.
Both phone OS vendors have over and over proven eager to satisfy every authoritarian, no matter the human impact. We're screwed.
CSAM is still problematic, though. The consensus is that it creates a market that incentivizes the abuse of children, because you cannot produce CSAM without CSA. It seems to be one of the few classes of data that is an exception to the standard of privacy applied to almost all other data, to the point that Apple took action against it. Dozens of other countries besides the US also agree, and outlaw that class of data as well. Finding actual CSAM (and not just any kind of pornography involving minors, hence the shift in terminology) is supposedly directly tied to finding child abusers, thus preventing future child abuse from taking place.
There appear to be few to no studies that give statistical evidence for the market hypothesis or that the spread of CSAM causes CSA. But even if they existed, I still think the argument would ultimately become "letting one more child abuser get away because we preserved our privacy instead." How can such an argument be challenged, given that the prevention of CSA is a legitimate problem on a global scale? Even so much as mentioning an argument that takes into consideration the nature of CSA appears to be taboo - and for good reason, as it carries a risk of being labeled many kinds of terrible things oneself. That seems to be why a lot of threads here are reducing the issue to "think of the children" or lambasting the potential slippery slope of authoritarian surveillance, instead of discussing why CSAM is outlawed to begin with, and thus the reasons why Apple came to this decision in the first place.
Another thing that nobody seems to talk about is that Apple doesn't want to be held liable for CSAM stored on their servers, either. Within that context, this change is Apple's way of addressing that issue, which also happens to erode individual privacy. Apple's values appear to dictate that the tradeoff is worth it in the end. About the only people that have pushed back on this change come from technology or privacy-conscious circles. Nobody else seems to care. That is the status quo, and I'm not sure how it's going to change with general public sentiment the way it is surrounding CSA.
> Finding actual CSAM (and not just any kind of pornography involving minors, hence the shift in terminology) is supposedly directly tied to finding child abusers, thus preventing future child abuse from taking place.
NCMEC and similar groups call all pornography involving minors CSAM.[1] And this system can't detect new CSAM.
> Another thing that nobody seems to talk about is that Apple doesn't want to be held liable for CSAM stored on their servers, either.
Many people said they would prefer server side scanning. Other people argued E2E encryption would shield Apple from liability.
Since the system can't detect new CSAM, doesn't it actually incentivize the production of new material? This mechanism could then actually increase the amount of abuse going on.
That just adds fuel to the fire: the intent here is not to protect children. The intent is to scan everybody for things "to be determined", and not subject to public scrutiny.
Europe is not immune to this. José Manuel Barroso, the ex-EU-Commission president, used to be a member of a communist revolutionary student group ... one that murdered people. He was present at at least three such killings, and it is not clear if "present" is where it stops, but obviously he was never convicted of murder (he was, of other things). When he came to office, there was a witch hunt for material and articles telling this story. And obviously plenty of lower-ranking officials wanted to take care of some of their own dirty laundry. This is the real source of the "right to be forgotten" privacy legislation. Makes you all warm and fuzzy inside, doesn't it.
EVEN when it comes to child abuse, the top politicians in Europe just brazenly ignore the rules. Emmanuel Macron's wife ... has publicly confessed to being a pedophile, and having sexually abused at least one child (yes, Mr. Macron: she was his French teacher. She blames him, incidentally, as if that matters). Macron has publicly confirmed this (and taken "the blame", which of course doesn't matter: he was a minor). For anyone in France, it is clear: this is a despicable crime, and she would be harshly punished ... but any action against him might bring Le Pen to power, or even tip the balance in the EU parliament to anti-EU parties if we are really unlucky. I bet he has said as much to the public prosecutor.
A child defending a pedophile happens often, by the way, especially to stay out of the hands of child services, and is of course never accepted as an excuse ... except ... when it applies to even pretty low-level government officials.
I was gonna write a whole thing, but mostly I just wanted to say we'll be able to deepfake CSAM inside of 10 years, no question, so this whole thing is moot.
I think this is a massively important point that needs further consideration.
If a person can generate images on demand, any system based on recognising known images is immediately bypassed.
AI image classification will be your only option, but the false positive volumes have the potential to be massive, swamping any follow-up investigation.
> The article mentions that 40% of CP investigations in Germany are started against minors (implied: through sexting), which shows that this concern is not purely theoretical at all.
I've read news about this. Teenager sends nude photos to her boyfriend, calls the cops after the break up and they proceed to nearly ruin his life by threatening lifetime sex offender registration. The implication that the girl produced and distributed child pornography of herself is never even mentioned.
Not only that, but child porn seems to mean a different thing in every place. The Netherlands and other European countries define the age of consent at 16, meaning some things that are perfectly legal there will be a ticket to prison in many countries. Now you don't just have an engineering problem, you will also have to spend a bunch of money to have some lawyers review your spec.
I believe porn involving 16 year olds is still legal in the Netherlands. Which doesn't mean it is being commercially produced anymore, for obvious reasons. Maybe a Dutch reader can provide more insight into this.
It even includes people that "appear to be under 18" even if they turn out not to be. ("[...] afbeelding – van een seksuele gedraging, waarbij iemand die kennelijk de leeftijd van achttien jaar nog niet heeft bereikt, is betrokken of schijnbaar is betrokken, [...]" – roughly: "[...] depiction – of sexual conduct in which someone who apparently has not yet reached the age of eighteen is involved or appears to be involved, [...]")
What I also found interesting is:
> Het voorontwerp Wet seksuele misdrijven stelt niet langer strafbaar degene die een visuele weergave van een seksuele gedraging of een gegevensdrager bevattende een visuele weergave van een seksuele gedraging, waarbij hijzelf of een andere persoon die de leeftijd van achttien jaren nog niet heeft bereikt is betrokken, in het kader van een gelijkwaardige situatie tussen leeftijdsgenoten uitsluitend voor privégebruik vervaardigt, in bezit heeft of met die ander deelt.
Short translation: sending or owning pictures of sexual acts involving an <18yo is not illegal if it concerns private use between persons of similar age in an 'equality situation' (I interpret this as meaning/implying 'consensual').
The problem is when you analyze what this means in practice. In Belgium consensual sex is legal IF
1) both parties are < 18, and the age difference is less than 5 years OR one party is > 18 and the age difference is less than 2 years
2) both parties are older than 14 years (used to be 16)
3) there is no "power relationship" between them (this requirement actually drops when one, not both, turns 18)
Unfortunately ... now apply these rules to a sexual relationship between a 14 and a 17 year old (see the sketch after this list):
a) 1st year: legal, any images are not CSAM
b) 2nd-4th year: illegal, any images are CSAM
c) 5th year onwards: legal, images not CSAM
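A rough Python sketch of the three rules as described above (not legal advice; the thresholds are taken verbatim from this comment, and "older than 14" is read as "at least 14" so that the first-year case comes out legal, as stated):

```python
def relationship_legal(age_a: int, age_b: int) -> bool:
    """Rough model of the three rules described above; the real statute has more nuance."""
    younger, older = sorted((age_a, age_b))
    if younger >= 18:
        return True                   # both adults: these rules no longer apply
    if younger < 14:
        return False                  # rule 2: minimum age (read as "at least 14")
    if older < 18:
        return older - younger < 5    # rule 1, both minors: <5 years difference
    return older - younger < 2        # rule 1, one adult: <2 years difference

# Rule 3 (no "power relationship") is omitted here; per the comment above, that
# requirement drops once one of the two turns 18 anyway.

for year in range(1, 7):
    a, b = 14 + (year - 1), 17 + (year - 1)
    status = "legal" if relationship_legal(a, b) else "illegal"
    print(f"year {year}: ages {a}/{b} -> {status}")
# year 1 -> legal, years 2-4 -> illegal, year 5 onwards -> legal
```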
Law sucks. It used to be worse, this is an actual improvement over the previous situation, but ...
What is absolutely not clear to me: does (a) mean sexual images of a 14 year old are now legal in Belgium in that case? The law does not seem to require that the owner of the pictures has to be one of the participants ... but I find it hard to believe this is the actual intent.
And to make matters even worse, this is not the only way to punish kids. While the above will prevent criminal prosecution, you DO NOT need a criminal conviction to lock up a minor (and they're trying to extend this). Youth services can and do lock up both partners in the above example without any proof, in practice by getting one social worker to say something like "it is an unhealthy relationship" (they can shop around until they find one, btw, and often the one they find has never seen either kid): the younger for up to 7 years and the older for up to 4. In practice they will punish the younger kid, almost always the girl.
The fun thing is that when a kid is locked up for criminal reasons (bad enough that he actually goes to prison), schooling CANNOT be denied to the kid (and is not in practice). When youth services lock kids up, schooling can be denied (and is in practice). But of course the previous point means the police will try to use child services, not criminal prosecutions, if at all possible.
So sadly, if you want a kid's situation to improve over time and they do not want to end the relationship, your course of action is clear: you should let the abuse continue and even (help) hide it. However, if you want to hurt the kid(s) (whether or not the relationship is actually abusive), you should report them. So here too the net result of the law is: very easy to "abuse" the law to hurt children (esp. if you are a social worker), very hard to use the law to protect children against abuse (or actually help them).
The age of consent goes as low as 14 in Europe, but that doesn't change the definition of CP. They probably won't prosecute as heavily when both parties are underage, but the age of consent doesn't change the fact that nudity is 18+.
Nudity is 18+ in extremist countries like the Emirates or the USA.
Sorry, but in France 30 years ago, before American culture started to dictate (that being where the money comes from), you would see genitals of all genders and ages in movies.
Nobody found it offensive, because we didn't have a Catholic puritan morality imposing its sick, twisted vision of the human body on us.
Meanwhile we sold cars by showing cars, not hot girls. But that apparently is OK, because money.
It's also perfectly alright to show kids images of people killing each other en masse. Because that is sane, unlike nipples.
Indeed. But it strikes me that the places with the most rigorous enforcement of theological dominance also have the worst child abuse problems. Or, as an AI would decide, men in robes can't be trusted.
Well if you tell people they are sinners for being humans (having body parts), you get frustrated humans.
I have been in very rigid religious countries. The men in the internet shops were all watching hardcore porn, and my female friends were harassed in the street.
So it's not men in robes; you can be deeply religious and have a loving view of the world.
But puritanism, and sexually frustrated people ashamed of even existing, are bound to generate unhappiness.
> the places with the most rigorous enforcement of theological dominance also have the worst child abuse problems.
Citation needed. The data doesn't back up that claim, contrary to the media's narrative.
A report which Christian Ministry Resources (CMR) released in 2002 stated that contrary to popular opinion, there are more allegations of child sexual abuse in Protestant congregations than there are in Catholic ones, and that sexual violence is most often committed by volunteers rather than by priests.
Catholic clergy aren't more likely to abuse children than other clergy or men in general. The 4 percent figure appears lower than that for school teachers during the same time frame, and certainly lower than for offenders in the general population of men.
You mean like in movies, those "can't see boobies before you're 18" labels? Because capturing a nude 13-year-old is perfectly legal, just not in a pornographic manner. E.g. my Dutch biology book depicted nude ~6 and ~11 year old girls (as well as an adult) to illustrate different stages of development.
Not a lawyer, but my understanding is that in the States, sending naked pictures of teens is illegal, even if it's a picture of yourself. If you send naked selfies to someone, you could be charged with distribution of child pornography if you are a teenager.
I'm also fairly sure I heard that even possessing naked selfies is considered illegal, at least in some states, e.g. this article about a teenager facing 10 years in prison for having pictures of himself naked at 16 years old on his phone:
https://www.rollingstone.com/culture/culture-news/teenager-p...
> A Fayetteville, North Carolina teenager has reached a plea deal to avoid being charged with multiple sexual exploitation counts after his cell phone was found to contain nude selfies of himself. Seventeen-year-old Cormega Copening, who took the photos of himself when he was 16, agreed to the deal in order to avoid possible jail time and being registered as a sex offender. As part of the plea, the teen agreed to random police searches without warrant for one year as well as other penalties, Fusion reports. The teenager was listed as both the victim and the perpetrator on the sexual exploitation charges.
This is quite possibly the most kafkaesque "justice" story I have ever heard coming out of America and that really says something.
The law and morality are often fundamentally opposed. Saudi Arabia punishes homosexuality with the death penalty, and Apple happily removed the encrypted communications from their phones so they could continue selling in that country.
Illegal or not, two teens sending naked selfies to each other is not even remotely comparable to child abuse by an adult.
>this article about a teenager facing 10 years in prison for having pictures of himself naked at 16 years old on his phone
That anyone would even consider spending taxpayer money to lock up a 16 year old for his own picture shows a colossal failure of government on every level: legislative, judicial, and executive. The DA stacked 5 charges and saddled them with up to a decade so the kids would be afraid to even go to court. In any sane judicial system that case would be laughed out of the courtroom, and it's telling that they were presumably advised by a lawyer to take the plea bargain.
I mean technically that already happens in the UK - you can have sex legally at 16, but you can't take any pictures of you doing so until you are 18.
So 2 consenting 17 year old teenagers cannot send indecent images of themselves to each other.
I seem to remember a case from a few years back, about a girl who got herself on the register for sending her boyfriend a picture of her naked or something.
Although it seems that the Protection of Children Act [1] I think (I'm not sure, it's hard to actually read with all the brackets ...) lets you do it if you are married (which you can also do at 16).
Any sane person would know that teenagers do what teenagers do ... having someone potentially going through those photos is such a strange and wrong approach
It might also turn into a good teachable moment for teens about privacy, sharing, and technology. I don’t take a picture on a smartphone of anything that I wouldn’t want to make public, and I surely wouldn’t share or post it anywhere. If it leaves your device, it’s effectively public, regardless of whether whoever you sent it to tells you they’ll keep it “private”.
Even this is naïve with Apple's new plan. How long do you really think they'll wait before a software update makes it apply to even things that don't leave your device?
This would be a much bigger step though, and any brand doing it, even Apple, would face very negative market effect.
It couldn't be legislated in the US because it would be unconstitutional to do so, almost impossible in the EU, possible in the UK/AU, even Canada (with the help of some creative rights busting from the Supreme Court, as they recently demonstrated).
AFAIK, ML image classification is used for checking user-generated content on social networks, forums, etc. Cloud storage, email and IM providers will continue using hashes (either exact or perceptual) to lower false positives.
In Apple's defense here, there seems to be a reason they're using the longer acronym CSAM. It means not just pictures of people without clothes on, but actual images/video of children being abused. Stuff that would likely provoke you or me or other normal people into acts of extreme violence against the perpetrators if we ever got the chance.
The real danger with Apple's system is that there's zero accountability in the list of "bad" hashes. So there's no way to know whether it's really all CSAM, or if it also includes rare Pepes, Bernie memes, or pictures of politicians doing embarrassing things.
> It means not just pictures of people without clothes on, but actual images/video of children being abused.
Who said that?
They're the same legally. People have said NCMEC's database includes entirely legal images. And groups like NCMEC have tried to rename child pornography for years.
But the point is there's an existing database. Either you have a photo in that database, or you don't. So your kids' naked photos don't get triggered. And even if some photo in there is legal, not only is there human verification and a minimum quantity required, it would be very strange for you to have photos in your iCloud that were part of CP collection sets, regardless of their content.
Their point was CSAM means especially bad CP. Your points were different.
It's a perceptual hash. Matches don't have to be exact. And people have engineered collisions for other perceptual hash algorithms. What a computer sees and what a person sees can be very different.
People have said the human verification just involves the visual derivative of the suspect image. Apple didn't explain what that is really. And human verification doesn't reassure people who object to anyone viewing their private photos without their consent for any reason.
People have said any images collected during an investigation go in the database. Do you think people who collect explicit photos of 17 year olds don't collect explicit photos of 18 year olds?
Sorry. Either the posts got updated or I got the thread confused as others were trying to say that (re: hash versus classifier).
Re hashing, Apple claims a 1-in-1-trillion likelihood of a collision. These kinds of systems are not rolled out lightly, and even if that number is wrong, it feels unlikely to me that it's too far off. If it is, and the system has too many false positives, this will get noticed and the system pulled until it's fixed and meets the required false positive rate.
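For a sense of scale, here is a back-of-the-envelope sketch of what that figure would mean if taken at face value. The account and upload counts below are round-number assumptions, not Apple's numbers, and as I recall Apple phrased the claim at the account level per year:

```python
# Back-of-the-envelope only; every count below is an assumption for illustration.
p = 1e-12                       # claimed probability of a false flag

# Reading 1: per account per year (roughly how I recall Apple phrasing it).
accounts = 1e9                  # assumed order of magnitude of iCloud accounts
print(accounts * p)             # -> ~0.001 expected wrongly flagged accounts per year

# Reading 2: if the same figure were per uploaded photo instead.
photos_per_year = 1e9 * 365     # assumed ~1 billion uploads per day
print(photos_per_year * p)      # -> ~0.365 expected single-image false matches per year
```

Either way, the arithmetic only matters if the underlying figure can be trusted, which is exactly what the replies below dispute.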
Ultimately, beyond Apple, if you get arrested and go before a judge, you'd expect humans at that point to look at the evidence. In fact, I'd expect the DA or whoever to similarly look at the photos at that point. Legally, you can't be sentenced without evidence in the US (how all of this works in other countries is another matter).
If there is legit 18+ porn mixed into this database, and someone ends up being charged because of it, fights, and wins, I'd expect a number of counter-lawsuits to follow. To me it seems incredibly unlikely both that non-CP (pictures of 17-year-olds rather than the more likely 7-year-olds) makes up a significant part of that database and that you'd be saving it to your iCloud photo roll. There's so much legal porn out there, in such vast quantities, that I'm not sure the hypothetical situation you describe will ever actually occur.
Apple's claim is unverifiable. It counts for nothing. And people with relevant experience have called it bullshit.[1]
Even just arresting someone means separating them from their children. Preventing them from working if their job involves children. Seizing all their electronics for months. Violating the privacy of their files and belongings. Legal costs. Possibly media reports. Putting innocent people through this because a secret algorithm said a secret database matched a secret number of times is unacceptable.
Several people have claimed false positives are in the database. Including someone who verifiably worked with it.[1]
US prosecutors have absolute immunity. Any civil suit would be dismissed swiftly.
People don't collect random subsets of all pornography ever made. Some is much more popular. People have specific tastes. Photo sets exist.
> 40% of CP investigations in Germany are started against minors
This in particular seems like such a silly outcome of over-regulation and bureaucracy. Amazing. Research suggests most child sexual abuse does not have much of a long-term effect at all:
This (of course!) does not mean it is good or should be done, but it suggests this is not an urgent issue, certainly not one that can justify abolishing privacy, especially when these measures only catch the small fish, not the large trafficking rings who will probably switch to different means of communication once theirs gets compromised.
You're arguing that child sexual abuse is "not that bad" by citing a guy whose "research papers" include gems like
* The left-liberal skew of Western media
* What Happened to Brussels? The Big Decline and Muslim Immigration
* Mental illness and the left
* Human Biodiversity for Beginners: A Review of Charles Murray's Human Diversity
* Race Differences: A Very Brief Review
* Racial and ethnic group differences in the heritability of intelligence: A systematic review and meta-analysis
* Global Ancestry and Cognitive Ability
* Sex Distribution, Life Expectancy and Educational Attainment of Comedians
* Immigrant crime in Germany 2012-2015
* Country of origin and use of social benefits: A large, preregistered study of stereotype accuracy in Denmark
* Inequality in the United States: Ethnicity, Racial Admixture and Environmental Causes
* Increasing inequality in general intelligence and socioeconomic status as a result of immigration in Denmark 1980-2014
* Criminality and fertility among Danish immigrant populations
He boasts 24 publications in Mankind Quarterly and 20 in OpenPsych, both of which he seems to run himself. Mankind Quarterly according to Wikipedia 'has been described as a "cornerstone of the scientific racism establishment", a "white supremacist journal", an "infamous racist journal", and "scientific racism's keepers of the flame"'.
There are excellent cases to be made why privacy and encryption should not be compromised in the name of hot button issues like "protecting children" but citing a study by a "scientific racist" and eugenicist, who is a known advocate for legalizing child pornography, to trivialize child sexual abuse is not it.
You don't need to be a "left liberal" not to cite Emil Kirkegaard. Being a decent human being or having any appreciation of actual science would suffice.