Whoever is generating and distributing these images should be charged with distributing child pornography. Doesn't matter if the images are AI generated.
That's my initial response, anyway. I'll concede that it's not well thought out. For one, it isn't comprehensive enough to solve the problem of the perpetrator being halfway around the world.
I think the problem is that the most likely perpetrators are the teenage boys the girls are in school with. It seems like exactly the sort of thing teenage boys would do without fully understanding the damage, never mind the legality. In that sense it feels very tricky to deal with - that said, it still needs a solution. Being a kid these days seems pretty awful. I thought it was bad when your mistakes could be caught on camera and put on Facebook. How much worse can it get?
I think the endgame is having a different understanding of privacy, modesty, etc. There's no way this is going to go away or be regulated away somehow. Heavy handed punishment of young kids who generate images just creates more problems (though I imagine we'll go through that phase). Eventually (in a generation or two) it will equilibrate and nobody will take the pictures seriously or be interested in making them. There's novelty now, it will go away.
I can't see any other realistic direction this will go.
I can’t see this happening. People have been saying this for a long time. On top of that, a lot of young girls are going to go through a lot of pain in the meantime - hoping for societal change seems negligent.
Yes. After all the alcohol limit of 21 is a massive success and leads to both people under 21 never drinking alcohol, and people over 21 being very responsible drinkers, enjoying one or two drinks in the evening instead of getting blackout drunk.
In light of these successes we should ban smartphones for anyone under 16. Computers too, since those can also be used to access AI tools. Anyone who says that this amounts to adults taking the bare minimum of parental responsibility, rather than stripping agency and responsibility from teenagers, is just a small-government naysayer.
I realize that users of this site might be inclined to think that non-perfect solutions are not acceptable. However, in the real world, all solutions are non-perfect. Like for instance alcohol and tobacco limits: they are a success, even if they don't totally prevent children from consuming those drugs.
A smart device ban would be similar to the ban of those substances. Not terrible, not great, but much better than status quo.
That opens the whole "why is child pornography illegal" question we already have with the legal status of drawn depictions of child pornography.
The most obvious answer is that it's about the harm to the subject of the pictures. Most child pornography is made through exploitation of minors, so we just forbid the whole category. Fictional child pornography, like a drawing (or an AI generated image) doesn't suffer from that, so doesn't have to be outlawed. That's largely the position of the US justice system for example.
Some countries go further, arguing about the impact of child pornography on society, especially on pedophiles. Pedophilic urges seem to be made worse, not better, by consuming child pornography, and that gives reason to outlaw it altogether, no matter how clearly fictional it is. That also leaves room for lots of subtlety, as when a Swedish court ruled that a manga expert could keep a drawing that would in other cases be illegal child pornography. Similarly, the fact that the case in this article concerns child pornography made by minors, for minors, could factor in.
In Spain specifically, the line is drawn at a certain level of realism. Real porn of real children is illegal, so are things that look exactly like it, manga levels of unrealism are legal, but somewhere between there's the line. Where these AI images fall on that line would be interesting, but impossible to judge without seeing them and having good knowledge of the Spanish legal system.
Child pornography is one of the worst forms of human depravity, because in order to satisfy one person's messed up desires, a child is subjected to unspeakable sexual violence. We're conditioned to protect our young for lots of obvious reasons, so this triggers an understandable and entirely justified visceral response.
This particular situation is ... different. It's clearly still causing pain to children. It's using their likeness without their consent and in a sexually violent way.
But ... I can't get behind the idea of equating it to child pornography.
It should absolutely be considered a crime, and come with its own set of punishments for those found guilty.
Again, making it absolutely clear that I personally find this act to be vile, unacceptable, and highly antisocial, I also think that it should be punished much less severely than producing/distributing "actual" child pornography...?
We treat manslaughter and murder as different things, perhaps that's a suitable analogy here?
This also seems similar to the whole issue of deepfaked porn involving celebrities. When folks said "AI is gonna usher in societal problems we have no idea how to deal with", I never imagined it would get this bad, this quickly.
It depends on whether the AI model was trained on CSAM or not, right?
If it was, then crime. If it wasn’t then no child was harmed and in a free thinking liberal society we don’t punish thought crimes.
And if AI models prevent people from committing actual harm to children, then isn’t this actually a win?
Humans and machines must be free to imagine. And as a society we must tolerate all art, even if it depicts something most people find gross. Consider, we have books, movies, and video games depicting killing, even though it’s illegal.
I'm a fan of not yucking other people's yum, especially in the privacy of their own homes, so long as it doesn't infringe on the liberty, safety, or wellbeing of others.
That said, if we've got folks who just straight up like child pornography (which we do, and always will for as long as we remain Homo sapiens, sadly), would the ability for them to consume this kind of generated content actually help? Or would it encourage these people to go further and prey on real human children?
I simply have no idea. I grew up with violent video games. I've had violent thoughts. But blowing someone's brains out in a game has never motivated me to do it in real life. I think that whole era of moral panic was silly. But human psychology is complicated, this could be very different.
Well... it's certainly interesting to ponder. As a personal anecdote I had zero parental supervision growing up and I spent an absurd amount of my formative years on the dark corners of the early Internet. During that time I got hooked on pornography the effects of which I deal with to this day. If I replay the scenario but insert the possibility of stumbling across what amounts to a pedophilia creation machine I... really don't want to think about it...
Creating pornography featuring the likeness of anyone, child or adult, should automatically be classified as a crime similar to revenge pornography laws.
Creating child pornography that does not feature the likeness of someone living or dead should be prosecuted under obscenity laws, but not as child abuse, since by definition no children were abused.
It doesn’t take into account whether there’s a victim or not. In a free thinking liberal society we don’t punish thought crimes because the concept is absurd. It’s what allows us to have diversity of art, literature, thought, etc. Fantasy != reality.
Today we allow all manner of “unspeakable” acts to be portrayed and imagined: war, murder, sexual abuse, speeding, gambling, fraud, you name it we can write, draw, think, and talk about it. There’s nothing fundamentally special about portraying a minor in a lewd way in that sense.
So I think any call to heavily punish people for a new crime should be framed in the context of: who’s the victim and what harm are we preventing? If there is no victim then it’s much harder to build a case that there’s harm.