The harms associated with someone creating a deepfake of you are real, but they're pretty insignificant compared to the harms of being sex trafficked, being exposed to an STI, or being unable to find traditional employment after working in the industry.
Think about the change in combat death tolls when flintlock muskets gave way to machine guns, or when battleships gave way to aircraft, and how many people died needlessly because generals were slow to update their tactics. Deepfakes are like that: they lower the cost and raise the success rate enough to be transformative, and they cause harm that can't easily be countered. We're not going to instantly train society to be better at media literacy, and the police can't simply ignore reports of sex crimes, so we're left accepting that it's easier to hurt people than it used to be.
No? And I didn't suggest deepfakes should be legal.
I was just pointing out that when you compare the scale of harm caused by the existing sex industry to the scale of harm caused by AI-generated pornographic imagery, the former far outweighs the latter.
That's not really true. Look at one of the more common uses for AI porn: taking a photo of someone and making them nude.
Deepfake porn exists, and it does harm.