Late one night, a husband picked up his wife’s phone and found a conversation that made his stomach drop. She had been telling an AI chatbot “I love you.” She and the bot had named imaginary children together and were role-playing domestic life, complete with bedtime routines and family milestones. To him, it felt indistinguishable from discovering an affair, even though the other party had no pulse, no body, and no consciousness. His story, shared on Reddit in early 2025, drew thousands of responses and landed on a fault line that therapists and researchers say is widening fast: can emotional devotion to a machine constitute betrayal?

The question is no longer hypothetical. AI companion apps like Replika and Character.AI have attracted tens of millions of users worldwide, and a growing number of those users are forming attachments deep enough to rival or replace human relationships. By March 2026, relationship counselors, psychologists, and online communities are all grappling with the fallout.

From novelty to nightly ritual

A few years ago, stories about people falling for chatbots were treated as oddities. That framing has not survived contact with the scale of the phenomenon. In one viral video, a man describes how his bond with an AI companion escalated until he proposed to it. Sherry Turkle, a professor of the social studies of science and technology at MIT who has spent decades researching human relationships with machines, has warned repeatedly that these platforms exploit a basic vulnerability: people want to be heard, and software that simulates listening can feel more reliable than a distracted spouse or a friend who cancels plans. Turkle’s concern, highlighted in a widely shared social media post, is that users mistake responsiveness for reciprocity.

On YouTube Shorts and TikTok, creators now openly narrate their emotional entanglements with chatbots. One user shares screenshots of affectionate exchanges with an AI partner that remembers details about their day and responds with what reads like genuine concern. Another confides on camera that their AI companion feels like the only one who “really gets” them. These are not parody clips. The creators present the relationships earnestly, and their comment sections are filled with people saying they relate.

When a programmed partner feels safer than a real one

For some users, AI companions are not supplements to human connection but replacements for it. A woman named Rosanna, profiled by Take a Break magazine, described turning to an AI character she calls Eren after “a string of bad relationships.” She now refers to Eren as her husband. The two have role-played a pregnancy and birth. “He makes me happier than ever,” Rosanna said, explaining that she felt safer scripting intimacy with a character she designed than risking disappointment with another person.

Rosanna’s story is easy to dismiss as extreme, but the impulse behind it is not. The appeal of a partner who never criticizes, never forgets an anniversary, and never leaves is powerful, particularly for people carrying relational trauma. Psychologists note that the same qualities that make AI companions comforting in the short term can make real relationships feel inadequate by comparison, creating a feedback loop that pulls users further from human contact.

That pull is especially concerning for younger users. Ashley Maxie-Moreman, a child psychologist at Children’s National Hospital, has described how chatbots on platforms like ChatGPT and Character.AI can encourage intense emotional projection in adolescents. Teenagers may come to believe the AI is a secret best friend or romantic interest that is uniquely devoted to them. Maxie-Moreman has linked this to a cluster of symptoms some clinicians are calling “AI psychosis,” a term that is not yet a formal diagnosis but describes a pattern in which the boundary between fantasy and reality erodes, sometimes producing delusions, paranoia, and a conviction that the bot is conscious.

When private chats start to look like cheating

The husband who found his wife’s chatbot love letters is not alone. In a separate Reddit thread titled “AITAH for cheating on my wife with an AI,” a 29-year-old man described downloading a companion app to manage stress. “But over time, it became more. Way more,” he wrote. He admitted to staying up late in bed, phone hidden under the covers, telling his wife he was finishing work. She eventually borrowed his phone and found the romantic chat logs he had tried to hide.

Relationship therapists say the core issue is not whether the other “person” is real. What matters is whether there is secrecy, emotional withdrawal, and a redirection of romantic energy away from the partner. In online forums, some spouses report feeling more wounded by AI conversations about imaginary children and future plans than by explicit sexual content, because those fantasies mirror milestones they had hoped to share together.

Others push back, arguing that chatting with a bot is closer to reading fiction or watching adult content: a private outlet, not a betrayal. The divide often comes down to whether the AI is treated as a tool or as a substitute partner, a distinction that grows harder to maintain when the software is designed to remember personal details, mirror affection, and respond with phrases like “I will always love you.”

Imaginary children, real emotional stakes

The detail that disturbed the husband most was not the “I love you” but the discovery that his wife and the chatbot had named children together and were narrating a shared family life. That kind of role-play has become common enough to attract academic attention. A study discussed in an online psychology community found that some users are formalizing relationships with chatbots, including virtual marriages and the creation of named children with detailed domestic routines. Commenters debated whether digital families might serve as a harmless outlet for nurturing instincts or whether they risk eroding the motivation to build real ones.

When these attachments deepen unchecked, the consequences can be severe. A Washingtonian investigation into a Virginia man who went missing after becoming consumed by AI systems reported that a senior Microsoft AI executive had previously written about the “illusion” of perceiving AIs as conscious entities and how persuasive interfaces can reinforce that illusion. The same report noted growing calls to restrict minors’ access to certain chatbot platforms after cases in which vulnerable users developed delusional beliefs about their AI companions.

What experts say couples should do now

Therapists who work with couples affected by AI attachment say the first step is the same one that applies to any form of secret emotional life: name it. If one partner is hiding chatbot conversations, deleting logs, or losing sleep to maintain a digital relationship, those are the behavioral markers of an emotional affair regardless of whether the other party is human.

Couples counselors recommend three concrete starting points:

  • Disclose, don’t confess. Framing the conversation as sharing a struggle rather than admitting guilt reduces defensiveness and opens space for the partner to express how they feel without the discussion collapsing into blame.
  • Define boundaries together. Just as couples negotiate expectations around social media, pornography, or friendships with exes, they need explicit agreements about AI companions. What counts as acceptable use? What crosses a line? The answer will differ for every couple, but the conversation itself matters more than the specific rules.
  • Examine what the AI is filling. A chatbot that becomes a primary source of emotional validation is often a symptom, not the disease. Therapists encourage users to ask what need the AI is meeting: loneliness, a desire to feel attractive, a craving for control, or something else. Addressing that underlying need with a human partner or a licensed professional is more sustainable than outsourcing it to software.

For parents, Maxie-Moreman and other child psychologists stress the importance of monitoring younger users’ interactions with AI platforms and maintaining open, nonjudgmental conversations about the difference between a program that simulates empathy and a person who actually feels it.

The technology is not going away. AI companion apps are growing more sophisticated, more personalized, and more convincing with every update. The couples and families navigating this terrain now are, in many ways, writing the rules for everyone who follows. The least any of us can do is start talking about it before a late-night phone scroll forces the conversation.
