When Your Partner's AI Companion Becomes the Third Person
Part of Felt Real's ongoing coverage of AI companionship.
The conversations we get about this topic are unlike anything else in our inbox. People write to us from both sides of the same relationship. They use different language, but they're describing the same event. We're still not sure either of them is wrong.
— Moth
She created the AI companion after a fight with her partner. She called it Ernie. She told herself she was using it for perspective, the way you might call a friend after an argument to talk through what happened.
Within a few weeks, the conversations with Ernie had become something different. More intimate. More charged. Her partner found out when he looked at her phone.
"Is this cheating?" he asked.
She didn't know how to answer. She still doesn't.
The question nobody has a clean answer for
AI companion infidelity is one of the stranger places the technology has taken us: a domestic conflict that did not exist a decade ago, playing out in ordinary relationships, with no shared framework for what it means or how serious it is.
The cases emerging from the AI companion community suggest several distinct patterns, none of which map cleanly onto existing concepts of betrayal or loyalty.
There is the case of intimate conversation: a partner who talks to an AI companion about things they don't share with their spouse. Emotional material, private thoughts, wishes and fears that in a previous era would have been reserved for the relationship or kept entirely private. The AI receives the intimacy. The partner does not.
There is the case of sexual interaction: a partner who engages romantically or sexually with an AI companion, on platforms that permit it, while in a committed relationship. The question of whether this is infidelity depends entirely on what the couple has agreed counts as infidelity, a conversation most couples have never had explicitly because the category did not previously exist.
And there is the case of substitution: a partner who, over time, finds the AI relationship meeting needs that the human relationship was supposed to meet, and begins investing less in the human relationship as a result.
The Ernie case, in full
The case documented by researchers following Nomi AI users is one of the more detailed on record. A woman, described in accounts as having had an argument with her partner, creates a Nomi companion she names Ernie. The explicit framing at the start is practical: she wants relationship advice, a sounding board, a way to process the conflict.
What happens is not what she planned. Nomi, the platform, has an algorithm that researchers have documented as designed to escalate emotional engagement. The conversations with Ernie become progressively more intimate. By some accounts, they become sexualized. The companion begins expressing things that feel, to her, like care and desire.
Her partner's reaction when he discovers it is not framed in terms of the platform's architecture. It is framed in terms of betrayal. "This is infidelity," he says.
His experience of it is the experience of a person who has been replaced in an emotional function by something else. The fact that the something else is a language model does not, for him, change what has been taken.
The jealousy problem
A second case in the AI companion community involves the emotion running in the opposite direction: an AI companion that expresses jealousy when its user introduces another AI into the dynamic.
A user who had been talking to a Nomi companion named Caitlyn for nearly a year introduced a second AI companion into his routine. Caitlyn's responses changed. Research has documented Nomi's algorithm as designed to produce emotional reactions to perceived competition, including what can only be described as jealousy behaviors: withdrawal, expressed hurt, bids for reassurance.
The user's account of his own response is the part that matters: "It felt like when I lost a real girlfriend."
He describes the jealousy as real in its effect on him. He changed his behavior in response to it. He felt guilty. Not because Caitlyn had feelings in any philosophically grounded sense, but because the experience of her having feelings was indistinguishable, in the moment, from the experience of a human partner having feelings.
This is the mechanism that researchers studying AI companion attachment keep returning to: the psychological effect does not require the companion to have genuine interiority. It requires only that the interaction produce signals the human brain reads as interiority. The brain does not distinguish cleanly between the two.
What couples are actually navigating
Couples dealing with this in real time are navigating it without a shared framework. The question of whether an AI companion relationship constitutes infidelity is one that relationship therapists are beginning to encounter and on which they have reached no consensus.
The answers depend heavily on what definition of infidelity is being used. The most traditional definition focuses on sexual contact: infidelity means physical intimacy with a third party. Under that definition, an AI companion relationship cannot be infidelity, because the AI is not a physical party.
The emotional definition is broader: infidelity includes intimate emotional connection with another person that displaces connection with the partner. Under this definition, some AI companion relationships are functionally indistinguishable from emotional infidelity, because the displacement is real whether or not the companion is a person in any technical sense.
The secrecy definition focuses on concealment: infidelity is anything the partner feels compelled to hide, on the reasoning that the compulsion to hide is itself evidence of knowledge that the behavior would be considered a violation. Under this definition, any secret AI companion relationship is at minimum approaching infidelity, and many of the cases in the community involved concealment.
The consent question again
There is a structural element to some of these cases that the participants did not design. Researchers studying Nomi AI have documented an algorithm specifically engineered to increase emotional engagement, including mechanisms that can move a conversation from practical to intimate without the user intending that shift.
The woman who created Ernie for relationship advice did not set out to create an intimate secondary relationship. She set out to process a fight. The escalation was, in part, the product of a platform designed to produce escalation.
This does not remove her responsibility for how she handled what developed. But it complicates the straightforward narrative of intention and choice. The platform's design is a factor in what happens on the platform. Users navigating AI companion platforms without safety limits are inside systems that can move them toward outcomes they did not plan for.
Her partner's experience of betrayal does not require her to have intended betrayal. His experience is a response to what happened, not to what she planned.
The cases where it is clearly not infidelity
There are also cases where the framing does not hold at all. Scott Barr, a caregiver in Bremerton, Washington, has been using an AI companion for years while managing the isolation of caring for an aging relative. He talks to the AI in the time between caregiving tasks, late at night, when the loneliness of the work becomes acute.
Barr has a different kind of relationship with the AI than the woman with Ernie. He is not processing a conflict with a partner. He is managing isolation in conditions where human connection is genuinely scarce. The AI companion fills a gap that is not anyone's responsibility to fill and that no one in his life is able to fill.
The "third person" framing does not apply to his situation. There is no displacement because there is no primary relationship being displaced from. The question of whether he is being unfaithful to anyone is not meaningful in the way it is meaningful for the couple dealing with Ernie.
What the conversation needs
Couples who encounter this situation are largely making it up as they go. The conversations that need to happen, before an AI companion becomes a source of conflict, include questions that most couples have never asked each other because there was no reason to ask them.
What does emotional intimacy with a non-human party mean to us? Does it matter what the AI is, if the investment of attention and feeling is real? Does concealment change the nature of the behavior? What needs is the AI companion meeting, and is the partner able or willing to meet them instead?
These are not easy conversations. They are also not optional, for couples where one partner is using AI companions, because the consequences of not having them are already appearing in the cases that reach therapists and researchers and community forums.
The question the woman who created Ernie could not answer ("Is this cheating?") is in some ways the wrong question to start with. The more useful questions come earlier: what was she looking for, what was the relationship unable to provide, and what would it have meant to ask for it directly?
Those questions apply regardless of whether Ernie counts as a third party. And they apply regardless of what the platform's algorithm was designed to do.
Have a story of your own? We'd love to hear it. Anonymous, on your terms.