
In an article I wrote for Magia magazine, I tackled a challenging topic: the dark world of sexual deepfakes, a threat far more serious than most of us imagine.
I was inspired by the article “‘Would love to see her faked’: the dark world of sexual deepfakes – and the women fighting back,” by Shanti Das, published in The Guardian on January 12, 2025, which offers a chilling picture of this phenomenon in England.
In the past, I’ve covered so-called romance scams, which are run by organized criminal groups that, thanks to artificial intelligence, now have the tools to deceive unsuspecting people more convincingly.
By contrast, the phenomenon of sexual or pornographic deepfakes described in that article most often involves seemingly “normal” people as the perpetrators of a different kind of crime.
In late 2017, when deepfakes began to spread, the term was primarily associated with the world of porn, involving the superimposition of famous people’s faces onto the bodies of actors in adult films. In recent years, it has generally referred to manipulated photos and videos, often used in disinformation campaigns. However, even today, most deepfakes are pornographic in nature and produced in pathological contexts.
The case recounted by the Guardian journalist is that of a woman who reported a man she had believed to be a friend: he had taken fully clothed photos of her from her private Instagram account and shared them on a forum, inviting other members to edit them into sexually explicit deepfakes.
Fortunately, there are activist groups that support victims by reporting such services to platforms such as Apple so they can be removed. Most of those affected, though not all, are women, as with romance scams: approximately 72% of the deepfake cases recorded by the charity helpline discussed in the article involve women.
For more details, see the full article.
