The recent wave of AI-generated celebrity porn highlights the harmful role of text-to-image models in enabling image-based sexual abuse
For those who’ve been keeping track of the ever-advancing abuse of deepfake technology, the story sadly came as no surprise. The kinds of non-consensual, sexually explicit deepfake images of Taylor Swift that went viral on X – circulating and multiplying on the platform formerly known as Twitter for almost a whole day – have existed online for years. In fact, Swift is one of the most popular targets for deepfake porn, and has been edited into hardcore videos since at least 2017.
Once in the public domain, though, the deepfakes spread like wildfire. As per The Verge, one of the most prolific posts, made by a now-suspended verified user, amassed more than 45 million views, 24,000 reposts, and hundreds of thousands of likes and bookmarks, and was live for 17 hours before it was removed. (Again, this isn’t particularly surprising, given Elon Musk gutted X’s moderation team when he took over the platform in 2022.) Swifties quickly sprang into action, flooding the timeline with ‘Protect Taylor Swift’ tweets to bury the trending search ‘Taylor Swift AI’. At one point, X blocked users from searching Swift’s name altogether.
Swift hasn’t commented on the images herself, but, in a testament to her and her fanbase’s colossal clout, the whole debacle has caught the attention of those who can actually do something about it (according to a report by Control AI, 75 per cent of American voters support criminal charges against people who shared AI deepfake sexual imagery of Taylor Swift). In the days following, a group of US senators introduced a bill that would criminalise the spread of non-consensual, sexualised deepfakes, while EU negotiators did the same (deepfake porn is already criminalised in the UK under the Online Safety Act). Musk, meanwhile, promised to hire 100 more moderators to tackle child abuse and other illegal posts.
For many people, this has opened their eyes to the reality and ubiquity of deepfake porn, which makes up 98 per cent of all deepfake videos online. They might have heard or even read about the phenomenon before, but never seen, as Henry Ajder, an AI and deepfake expert, puts it, “this kind of content in the synthetic flesh”.
“X takes action to combat deepfakes, blocking searches related to Taylor Swift. The move aims to protect against the misuse of AI-generated content. Swift’s image rights become a focal point in the battle against digital manipulation. #deepfake #AI #RegulatingAI #AIcommunity pic.twitter.com/EDl7CsescG” – Regulating AI - Innovate Responsibly (@RegulatingAI) January 31, 2024
And yet deepfake porn has long been thriving in certain corners of the internet – in Telegram channels (where, according to 404 Media, the Swift images originated), on 4chan, on dedicated sites like MrDeepFakes – targeting not just celebrities, but any woman who’s ever had her photo taken. With just one click on 4chan, for example, you can access a thread of people sharing photos or selfies of non-famous clothed women and asking someone called ‘Nigel’ to undress them. The recent rise of sophisticated text-to-image models – whose safe-for-work filters can easily be circumvented – has only exacerbated the problem, making tools to generate nudes increasingly easy to access and use.
“When it comes to Telegram and 4chan, without governments cracking down in ways that would probably make people feel uncomfortable, it’s basically impossible to stop this content spreading there,” says Ajder. Similarly, legislation outlawing deepfake porn can only go so far. It’s no secret that the criminal justice system fails victims of sexual violence time and time again (in 2021, just 1.6 per cent of recorded rapes resulted in a charge or summons); combine that with the anonymity provided by the internet, and there’s very little chance of catching this kind of perpetrator, let alone prosecuting them.
Not just that, but there’s a risk that over-policing will lead to outright bans on sexual content, which, as we’ve seen on TikTok, Instagram, and Facebook (and in a proposed Oklahoma bill that would ban watching porn and sexting outside of marriage), can lead to repressive censorship, the erasure of valuable sex education, and financial discrimination – all of which inevitably hits sex workers the hardest.
“Without governments cracking down in ways that would probably make people feel uncomfortable, it’s basically impossible to stop this content spreading” – Henry Ajder
Besides, any legislation or moderation is just a pointless sticking plaster if we’re not going to address the root cause. Women are the targets of deepfake pornography 99 per cent of the time, while 76 per cent of the perpetrators of intimate image abuse more broadly are men. This is part of a broader culture of misogyny that seems to be on the rise among young men in particular – something that was highlighted last week by a study into the growing political and social divide between young men and women.
There are organisations, like The RAP Project, The Right Path, and Men At Work, hoping to rectify this by bringing workshops on consent, sexual offences, porn and media literacy, and, yes, Andrew Tate into schools across the UK. “The heteronormative gender stereotypes that a lot of the world has revolved around have, over the last couple of decades, been shaken at their core,” says Deana Puccio Ferraro, co-founder of The RAP Project. “When your perceived place in society is being threatened, you’re going to push back. And that’s what’s happening with young men.”
“A huge 75% of American voters support criminal charges against people who shared AI deepfake sexual imagery of Taylor Swift. Internationally and bilaterally, people are opposed to deepfake tech. In the UK, 86% of the public support a ban on non-consensual deepfakes. We’re… https://t.co/W4nS52XqNF pic.twitter.com/n8KqoqlZMD” – Control AI (@ai_ctrl) February 7, 2024
Joe Cole, one of the directors of The Right Path, believes young men’s feelings of disorientation could be remedied by creating an environment in which “boys’ voices can be heard and they can feel empowered to share their views, opinions, and beliefs without condemnation, but still in an environment where they can be challenged”. The Right Path strives to create this environment during their healthy relationships sessions (and beyond) by asking students: why? They find this to be a particularly helpful technique when it comes to sexually explicit deepfakes and porn: ‘Why am I watching, making, or sharing this? Is it for humour, arousal, research, or information, or is it for revenge, spite, or blackmail?’ “We want to help young people understand their own behaviours and to think about the long-term consequences of their actions,” explains Cole. “We don’t teach them what to think, we teach them how to think.”
Neither The RAP Project nor The Right Path shies away from talking about deepfakes and porn. In fact, through their sessions, the latter has found that most boys are watching porn online from the age of 10 or 11. “A lot of these boys are accessing porn at that age, but not having meaningful sexual relationships until they’re 16, 17, or 18,” says co-director Gareth Rogers. “That can damage their perceptions of what a healthy relationship looks like, what consensual sex looks like, and actually what sex itself is.”
“We want to help young people understand their own behaviours and to think about the long-term consequences of their actions” – Joe Cole, The Right Path
Watching porn – with all its big dicks and macho men – at a time when their own bodies are changing may also be destabilising. “At one level, they’re being told that they have to be hyper-masculine, competitive, aggressive, all this stuff [that porn portrays], but then if they behave that way, they’re reprimanded,” says Ferraro. “They’re confused and defensive.”
This desensitisation to porn may also play a role in the normalisation of deepfakes. Rogers believes that making deepfake porn of celebrities or classmates is just another extreme behaviour that’s been enabled by this unprecedented access and technological advancement. Put simply, in a patriarchal, misogynistic society, “it offers boys a new opportunity to dehumanise and degrade women,” as Cole puts it. “It’s essentially another way of calling a girl in their class a ‘slag’.”
Except it has much bigger ramifications for those targeted. “Deepfake porn is an incredibly dangerous and powerful tool for blackmail or coercion,” says Emma Pickering, the head of technology-facilitated abuse and economic empowerment at Refuge. “Perpetrators can blackmail victims into resuming a relationship, sending them explicit imagery, sending them money, taking out credit cards, loans, or debts.” Victims have also spoken about the trauma that comes with not only seeing yourself having sex online, but seeing it in an encounter that you have no recollection of.
This is where legislation could be helpful – in changing perceptions of deepfake porn among young men. “If you spend enough time on forums like 4chan, you see a real apathy and blasé attitude from a lot of people creating this content,” says Ajder. “Like, ‘This is clearly fake, you’re overreacting, it’s a bit of a laugh’. What legislation does is provide zero space for grey areas to emerge. If someone said, ‘Look, by sharing this content, you’re doing something that’s really unethical and I don’t think you should do it’, that’s not as persuasive as, ‘If you share this content, you’re legally defined as a sex offender’, which obviously has more weight.”
The depressing reality is that nobody can really do anything to stop the creation or sharing of deepfake porn – at least without going full Big Brother on the internet. Yes, we can make it harder through legislation, robust moderation, or through tools like data poisoning (which, Ajder explains, is when you “inject signals into your images which corrupt them if someone tries to use them in an AI tool”), but we should also strive to make it less desirable in the first place.
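For the technically curious, here is a minimal sketch of the idea behind the image poisoning Ajder describes, written in Python and assuming PyTorch and torchvision are installed. It takes a single FGSM-style adversarial step against an off-the-shelf classifier, adding a near-invisible perturbation that degrades what a model can extract from the photo. Everything here – the poison_image function, the epsilon strength, the file names – is illustrative only; real protection tools such as Glaze and Nightshade work in far more sophisticated ways.

# Toy data-poisoning sketch: add a small adversarial perturbation that is
# nearly invisible to a human but disrupts the features an AI model reads.
# Illustrative only - not how any real protection tool is implemented.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

def poison_image(path: str, out_path: str, epsilon: float = 0.03) -> None:
    # A pretrained classifier stands in for "an AI tool" reading the image.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.eval()

    x = T.Compose([T.Resize((224, 224)), T.ToTensor()])(
        Image.open(path).convert("RGB")
    ).unsqueeze(0)
    x.requires_grad_(True)

    # Maximise the model's loss on its own prediction, so the gradient
    # points in the direction that most confuses the model.
    logits = model(x)
    loss = torch.nn.functional.cross_entropy(logits, logits.argmax(dim=1))
    loss.backward()

    # One signed-gradient (FGSM) step, clipped back to valid pixel values.
    poisoned = (x + epsilon * x.grad.sign()).clamp(0.0, 1.0)
    T.ToPILImage()(poisoned.squeeze(0).detach()).save(out_path)

poison_image("selfie.jpg", "selfie_poisoned.jpg")

The trade-off this sketch makes visible is the one any such tool faces: the bigger the injected signal (epsilon), the more it corrupts what a model sees, but the more visible it becomes to the human eye.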
And yet, although this needs to happen in schools, the most resistance The Right Path faces is from teachers and parents themselves. “We live in a society in which adults openly objectify women and sexualise them for profit, and yet they feel too anxious and embarrassed to talk to young people about these issues,” concludes Cole. “If the environment is sick, it’s no wonder young people also have a sickness in this area.”