Child pornography is one of the worst forms of human depravity, because in order to satisfy one person's messed-up desires, a child is subjected to unspeakable sexual violence. We're conditioned to protect our young for lots of obvious reasons, so this triggers an understandable and entirely justified visceral response.
This particular situation is ... different. It's clearly still causing pain to children. It's using their likeness without their consent and in a sexually violent way.
But ... I can't get behind the idea of equating it to child pornography.
It should absolutely be considered a crime, and come with its own set of punishments for those found guilty.
Again, making it absolutely clear that I personally find this act to be vile, unacceptable and highly antisocial, I also think that it should be punished much less severely than producing/distributing ... err ... "actual" child pornography...?
We treat manslaughter and murder as different things, perhaps that's a suitable analogy here?
This also seems similar to the whole issue of deepfaked porn involving celebrities. When folks said "AI is gonna usher in societal problems we have no idea how to deal with", I never imagined it would get this bad, this quickly.
It depends on whether the AI model was trained on CSAM or not, right?
If it was, then it's a crime. If it wasn't, then no child was harmed, and in a free-thinking liberal society we don't punish thought crimes.
And if AI models prevent people from committing actual harm to children, then isn’t this actually a win?
Humans and machines must be free to imagine. And as a society we must tolerate all art, even if it depicts something most people find gross. Consider: we have books, movies, and video games depicting killing, even though killing itself is illegal.
I'm a fan of not yucking other people's yum, especially in the privacy of their own homes, so long as it doesn't infringe on the liberty, safety or wellbeing of others.
That said, if we've got folks who just straight up like child pornography (which we do, and always will for as long as we remain a race of homo sapiens, sadly), would the ability for them to consume this kind of generated content actually help? Or would it encourage these people to then go further and prey on real human children?
I simply have no idea. I grew up with violent video games. I've had violent thoughts. But blowing someone's brains out in a game has never motivated me to do it in real life. I think that whole era of moral panic was silly. But human psychology is complicated, this could be very different.
Well... it's certainly interesting to ponder. As a personal anecdote, I had zero parental supervision growing up and I spent an absurd amount of my formative years in the dark corners of the early Internet. During that time I got hooked on pornography, the effects of which I deal with to this day. If I replay the scenario but insert the possibility of stumbling across what amounts to a pedophilia creation machine I... really don't want to think about it...
Creating pornography featuring the likeness of anyone, child or adult, should automatically be classified as a crime similar to revenge pornography laws.
Creating child pornography that does not feature the likeness of someone living or dead should be prosecuted under obscenity laws, but not as child abuse, since by definition no children were abused.