Josh Taylor
The Guardian
Originally published 21 July 2023
Here is an excerpt:
When you sign up for the Eva AI app, it prompts you to create the “perfect partner”, giving you options like “hot, funny, bold”, “shy, modest, considerate” or “smart, strict, rational”. It will also ask if you want to opt in to sending explicit messages and photos.
“Creating a perfect partner that you control and meets your every need is really frightening,” said Tara Hunter, the acting CEO for Full Stop Australia, which supports victims of domestic or family violence. “Given what we know already that the drivers of gender-based violence are those ingrained cultural beliefs that men can control women, that is really problematic.”
Dr Belinda Barnet, a senior lecturer in media at Swinburne University, said the apps cater to a need but, as with much AI, their effects will depend on the rules that guide the system and how it is trained.
“It’s completely unknown what the effects are,” Barnet said. “With respect to relationship apps and AI, you can see that it fits a really profound social need [but] I think we need more regulation, particularly around how these systems are trained.”
Having a relationship with an AI whose functions are set at the whim of a company also has its drawbacks. Replika’s parent company, Luka Inc, faced a backlash from users earlier this year when it hastily removed erotic roleplay functions, a move many users found akin to gutting their Rep’s personality.
Users on the Replika subreddit compared the change to the grief felt at the death of a friend. A moderator of the subreddit noted that users were feeling “anger, grief, anxiety, despair, depression, [and] sadness” at the news.
The company ultimately restored the erotic roleplay functionality for users who had registered before the policy change date.
Rob Brooks, an academic at the University of New South Wales, noted at the time that the episode was a warning to regulators about the real impact of the technology.
“Even if these technologies are not yet as good as the ‘real thing’ of human-to-human relationships, for many people they are better than the alternative – which is nothing,” he said.
My thoughts: Experts worry that these apps could promote unhealthy expectations for human relationships, as users may come to expect their partners to be perfectly compliant and controllable. Additionally, there is concern that these apps could reinforce harmful gender stereotypes and contribute to violence against women.
The full impact of AI girlfriend apps on human relationships is still unknown, and more research is needed to understand it. In the meantime, it is important to be aware of their potential harms and to regulate them accordingly.