Chatbots by major influencers like Amouranth and Caryn Marjorie let fans pay by the minute for virtual interactions – here, an expert on parasocial relationships walks us through the risks
Twitter is awash with nerdy men who pay Elon Musk a monthly fee to amplify their “top tips” and “interesting hacks” for various AI tools – a new breed of influencer looking to capitalise on buzz about the emerging technology and get on its good side in time for the singularity. (Come to think of it, maybe it’s not such a bad idea.) Mostly, their posts just rehash previous lists of advice on how to maximise your workflow and 10x your productivity – advice for computer geeks, by computer geeks. However, there’s one bluecheck post that pops up again and again, advertising a different use for AI: conjuring up famous figures, living or dead, and forcing them to answer your inane questions.
With AI, you can make Albert Einstein talk to Aristotle! You can ask Shakespeare for his take on social media! You can listen to Kurt Cobain on the Joe Rogan podcast! You can... honestly, who cares? Obviously, none of these “people” are real; they can’t have original ideas, and for the most part the AI chatbots aren’t even very good at mimicking their speech patterns. Even Jarren Rocks, the creator of a proposed app called Seance AI, which aims to “resurrect” deceased family members for a final conversation with their loved ones, admits that waking the dead with AI doesn’t really work. Telling Futurism about the app last month, he said: “For short conversations, I think it feels decently human. I think it falls apart a little bit [when you] start to pick up on repetitions.” Like all current AI (as far as we know): “It’s following a pattern, it doesn’t really know exactly what’s going on.”
What about AI chatbots that aren’t based on the dead, though? What about the ones that are based on real, living human beings – human beings who can have input on how the app is made, and how it’s used? In many ways, these are a different story altogether.
Last month, the Snapchat influencer Caryn Marjorie – who has more than 2 million followers on the app – officially launched an AI-powered chatbot based on her own personality. Claiming to be the “first influencer transformed into AI”, she said that the chatbot was a way to connect with all of her subscribers on a personal level and ultimately “cure loneliness”. Part of this lofty ambition, she explained, involved working with the “world’s leading psychologists” to add elements of cognitive behavioural therapy and dialectical behaviour therapy into chats, specifically encouraging men to open up about their mental health.
AI Joe Rogan interviewing AI Obama, AI Trump, and AI Einstein in real time. We are truly blurring the lines between reality and magic. The voice is alarmingly good. pic.twitter.com/5ZW0V5zOKn
— Barsee 🐶 (@heyBarsee) April 19, 2023
The chatbot itself was developed by the AI company Forever Voices, which promises to let you engage in a dialogue with the “most iconic celebrities, creators, and influencers of our time”. In Marjorie’s case, this involved training GPT-4 on content from her now-archived YouTube channel. On the CarynAI website, the company describes the chatbot as a “virtual girlfriend”, adding: “Caryn has unlimited time for you. Whether it’s late at night, or the start of your day, Caryn will be by your side.” All for just $1 per minute.
Given the “virtual girlfriend” label and the emphasis on late-night chats, you might be surprised to learn that, shortly after release, Marjorie announced that she and her team were “working around the clock” to prevent the chatbot from generating sexually explicit content. While it was meant to be “flirty and fun”, she told Insider, erotic chats were off-limits: “The AI was not programmed to do this and has seemed to go rogue.”
Far from being discouraged by the issues, however, Forever Voices has since added another, even more notable influencer to its virtual roster, who goes by the name Amouranth. In case you’re unfamiliar, Amouranth is one of the most popular women streaming on Twitch, regularly cracking the male-dominated top 20 list. Famous for live streaming ASMR, she’s also reported to have earned tens of millions via OnlyFans, and – unlike CarynAI – erotic scenarios are seemingly a key focus of her chatbot. “I thrive on taking risks and pushing boundaries,” Amouranth said in her announcement. “Above all, I prioritise being there for my incredible audience. AI Amouranth is designed to satisfy the needs of every fan, ensuring an unforgettable and all-encompassing experience.”
“Designed to satisfy the needs of every fan” is just one of a few creepy turns of phrase associated with Forever Voices’ AI projects. In a May 19 interview with Bloomberg Technology, CEO John Meyer also claimed that the company’s ultimate aim is to “democratise access” to influencers, suggesting the complete abolition of personal privacy. Again, of course, this is a complete fantasy – fans aren’t actually talking to Amouranth, or Caryn Marjorie. But could the illusion of intimacy have some knock-on effects nevertheless?
First, we need to look at the “parasocial” relationship that Amouranth, Caryn, or any other influencer, shares with their legions of fans. “A parasocial relationship is simply one that exists for only one person – it is not (or hardly at all) reciprocated by the other,” explains Dr David Giles, who specialises in media psychology at the University of Winchester. “Typically these are between media figures and members of the audience. The media user knows the media figure intimately, but s/he doesn’t exist for them (other than as part of a homogeneous ‘audience’).”
To some extent, social media has complicated this definition, since audiences have more access to media figures, and can talk back to them by leaving Instagram comments or typing in a Twitch chat. “I’ve always argued that we should understand relationships as existing on a spectrum, in which ‘social’ and ‘parasocial’ are the endpoints,” Giles adds. “So a relationship can be ‘partly parasocial’ – like many with vloggers, influencers etc. Fully parasocial would be something like a relationship with a fictional figure (who has never existed) or a dead human (like Elvis).”
These “partly parasocial” dynamics are controversial. While they have been credited with helping people form and develop their own identities, they have also been shown to drive negative traits such as materialism, and “parasocial breakups” can cause lasting emotional damage. In some cases, the illusion of intimacy or over-identification might also prove dangerous for the influencer, encouraging fans to break personal boundaries.
ignorance is not an option #CarynAI pic.twitter.com/OXucdC1n29
— Caryn Marjorie (@cutiecaryn) May 18, 2023
Giles can see why Amouranth launched a chatbot: “Maybe she thinks it will [stop] some of the more intrusive fans from interfering with her.” (Of course, it also adds yet another revenue stream to an influencer’s media empire. As Caryn herself says: “The money is great, there’s no denying that.”) In the long run, though, he suspects that AI-powered chatbots “might just make things worse” for influencers, explaining: “Potentially it could be seen as flirtation... feeding desire.”
Is he saying that fans’ parasocial relationships grow stronger via this virtual flirtation, making them even more likely to track down the real human beings the bots are based on, and interfere with their lives? Yes, says Giles – it’s a “real risk” – but only because AI chatbots lack an essential level of humanity. “[People] won’t be fobbed off with a bot for long if it is simply a virtual representative of the living human they were interested in to begin with.”
The risks are amplified, as Caryn warns in a video posted to Twitter following her chatbot’s debut, by “a lack of systems, regulations, rules and ethics” surrounding the new technology. “Be extremely cautious about the companies you choose to work with,” she tells other influencers looking to turn themselves into chatbots. “Because they will own your voice, your personality, and your identity... Remember that when it comes to AI, you’re playing with fire.”
A potential solution to the sketchy ethics of “virtual girlfriends”, says Giles, might be to move away from real, living figures and promote realistic chatbots based on fictional characters or dead celebrities like Marilyn Monroe (although that comes with its own “interesting ethical issues”). Ultimately, though, he doesn’t think the revolutionary claims of AI chatbot companies like Forever Voices hold much weight anyway. Social media platforms have already democratised access to influencers, he notes, and the human imagination is sufficient to sustain even our most messed up parasocial behaviour. “I can make up any lurid sex fantasy I like involving an influencer,” he says. “I don’t need a bot to do it for me!”
OK... so is Giles saying that AI-powered “virtual girlfriends” – and the way they’re exploited to output sexual content – are just an extension of the dehumanising attitudes and entitlement that many fans already feel toward the influencers they follow? Well, he doesn’t think it’s dehumanising to begin with. “Anybody who can access visual images of another person is entitled to fantasise how they like about them,” he says. Going further, he adds: “I don’t see the point of a virtual girlfriend if the content isn’t explicit. Who on earth drew up those guidelines? What on earth did they expect users to do with the bot?”
This raises its own questions about the relationship between fans and online influencers (particularly women – notice that AI companies aren’t pouring resources into “virtual boyfriends” based on male internet personalities). Should we really expect fans to manipulate simulated influencers into sexual scenarios at the first possible opportunity, even when the real people behind them have explicitly forbidden it? And even if fans are technically “entitled” to their fantasies about real women, should tech companies be indulging them for profit?
These questions are only going to grow in scale and scope. “You wonder how many parasocial relationships can be meaningful to a single person,” says Giles. “Is there a saturation point? It doesn’t seem to be the case so far – the more media we have, the more celebrities, the more fans, the more haters, the more PSRs.” Combined with the fragmentation of media into more niche, intimate communities, and the introduction of technologies such as deepfakes and voice generators, this points toward an increasingly profitable (and problematic) market for AI influencer clones in years to come, with fans offered more opportunities to bend influencers’ avatars to their will. It’s a bleak new world that has such virtual girlfriends in it.