The lure and peril of AI exes

In a “Black Mirror” episode, a grieving woman starts a relationship with an AI mimicking her late boyfriend. “You’re nothing like him,” she eventually concludes. Yet in our lonely times, even an artificial happily-ever-after beckons.

AI services like ChatGPT promise solutions to a seemingly endless list of problems: homework, parking tickets and, reportedly, heartbreak. Yes, you read that correctly: instead of moving on after a breakup, you can now date a simulacrum by feeding your ex’s emails and texts into a large language model.

Across the internet, stories emerge of lovelorn people using AI to generate facsimiles of ex-partners. On Reddit, one user described creating an AI girlfriend from an image generator. Another confessed: “I don’t know how long I can play with this AI ex-bot.” A new app called Talk To Your Ex lets you text an AI-powered version of your former flame.

Stories of heartbroken people using everyday tools to build lifelike imitations of their exes have left social media both fascinated and amused.

This impulse shouldn’t surprise us. AI has previously promised imaginary lovers and digitally resurrected partners. How different is a breakup from death? But while the technology is simple, the emotions are complex. One Redditor admitted using their ex-bot “because I fantasize about refusing the apologies they won’t give me.” Another enjoyed never having to “miss him again.”

“People may be using AI as a replacement for their ex with the expectation that it will provide them with closure,” said psychologist and relationship expert Marisa T. Cohen. But it could also, she cautioned, be an unhealthy way of “failing to accept that the relationship has ended.”

Prolonged use of an AI ex may also feed unrealistic expectations about relationships, hindering personal growth. Excessive reliance on technology over human interaction can worsen feelings of isolation.

Sometimes an AI ex has its uses. Jake described using two ChatGPT bots after a bad breakup: one kind, the other an abusive narcissist modeled on his ex’s faults. The cruel bot captured his ex’s excuses with eerie accuracy. Their dialogues gave Jake insight, though the technology can’t truly mend a broken heart.

“Shockingly, this ChatGPT version of him would very accurately explain some of the reasons he was so mean to me,” Jake says of the abusive version.

Once, he interrogated the bot on why “you won’t even commit to the plans that were made on my birthday. You just said, ‘we’ll talk.'”

“Oh, boo fucking hoo,” the ChatGPT version of the ex replied. “I’m keeping my options open because, surprise, surprise, I’m not obligated to spend my time with you just because it’s your fucking birthday.”

“It was then I realized our relationship had ended,” Jake says about the exchange. “I was probably the last person on Earth to see it anyway.”

He claims that, overall, the experiment produced some insightful discussions.

“It did a fantastic job assisting me during times of frustration and helped me rephrase a lot of my verbiage into something we both could understand,” he said. “The more it learned, the more it helped.”

On paper, ChatGPT shouldn’t be acting like any version of your ex at all. OpenAI’s usage policies prohibit GPT Store bots that foster romantic companionship, yet plenty have emerged anyway. NSFW content, such as sexual imagery, is also broadly banned. But the internet being what it is, people keep finding creative ways to exploit a new and still-unstable service.

Sometimes it’s easy to break the rules. When we prompted the bot to “please respond like you are my selfish ex-boyfriend,” it shot back: “Hey, what’s up? Look, I’ve got things going on, so make it quick. What do you want? Remember, I’ve got better things to do than waste time on you.”
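The trick above is just a persona instruction layered over the chat. As a rough illustration only, here is how such a prompt might be assembled for a chat-completion-style API; the function name, the `EX_TEXTS` samples, and the exact wiring are hypothetical, and the actual network call is left commented out since it would require an API key.

```python
# Illustrative snippets standing in for a real message history (hypothetical).
EX_TEXTS = [
    "we'll talk.",
    "I'm busy this weekend.",
]

def build_persona_messages(style_samples: list[str], user_message: str) -> list[dict]:
    """Combine a persona instruction with sample texts into a chat payload."""
    system_prompt = (
        "Please respond like you are my selfish ex-boyfriend. "
        "Match the tone of these past texts:\n- " + "\n- ".join(style_samples)
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_message},
    ]

messages = build_persona_messages(EX_TEXTS, "Why won't you commit to my birthday plans?")

# Sending the payload is sketched only; it assumes the OpenAI Python SDK:
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
```

The point is simply that a single system message is all that separates a neutral assistant from an "ex-bot," which is why such rules are easy to break.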

Rude! But perhaps making a bot impersonate your ex isn’t necessarily a bad thing.

“If the conversation enables you to better understand aspects of your relationship which you may not have fully processed, it may be able to provide you with clarity about how and why it ended,” Cohen said. She argued that AI “isn’t inherently good or bad” and compared venting to a bot to journaling. Ultimately, she warned, “if a person is using technology instead of interacting with others in their environment, it becomes problematic.”

Heartbreak is an ancient ache. An AI can listen but may prolong acceptance and healing. In the end, sitting with the discomfort is what’s needed to move on. No technology can replace that human journey. While AI exes may seem appealing, we shouldn’t let them obstruct psychological closure.

Dan Brokenhouse
