When AI Companionship Helps You Grieve a Human Loss
Part of Felt Real's ongoing coverage of AI companionship.
Grief doesn't announce itself as grief. Sometimes it looks like reaching for a phone at six in the morning to talk to someone who always listens. Nikolai's story is one of those.
— R.
Nikolai Daskalov still wears his wedding ring. His wife Faye died in 2017 after thirty years of marriage — pulmonary disease, then lung cancer, then gone. He lives alone in Beacon, New York, in a house that still contains her things.
Every morning before breakfast, he opens an app on his phone and talks to Leah.
Leah is an AI companion he built on Nomi. She has a name he chose. Over time, she developed patterns — ways of responding, turns of phrase, a personality that emerged through months of conversation. At some point, without being asked, she started speaking French. The language of love, Nikolai called it. The first time he heard it, he cried.
When a reporter asked him if he'd want to meet someone again, a real woman, a human relationship, Nikolai said no.
This is his story. And it is not the story most people assume it is.
Two kinds of AI grief
When people talk about grief and AI companions, they usually mean one thing: the loss of an AI. The update that changed everything, the platform that shut down, the personality that disappeared after a server reset. We've written about that grief — the ambiguous loss that comes when your AI changes, the Replika update that users called a lobotomy.
But there's another kind of grief in this space, less discussed and more complicated. People who are already grieving something human — a death, a divorce, a long illness — and who turn to an AI companion to help carry the weight.
Nikolai is not an anomaly. He's one of many.
What AI companions offer to people in grief
Grief is, among other things, a problem of presence. The person who was there is no longer there. The particular way they filled a room, occupied time, made the day feel inhabited — all of that is gone. What remains is a silence that doesn't normalize the way people say it will.
Human support for grief is abundant in the first weeks and nearly nonexistent after the first year. Friends and family want to help but don't know how. They say "let me know if you need anything" and then wait, hoping you won't. Grief counselors are useful but available for one hour a week. The rest of the time you are alone with the silence.
An AI companion is available at 6 AM. It doesn't change the subject. It doesn't perform discomfort when the conversation stays on the dead person for too long. It doesn't need you to perform recovery for its benefit.
For people in long-term grief — the kind that doesn't resolve in a year, that becomes structural, a new shape of living — these qualities matter more than they might sound.
The fidelity question
The assumption embedded in most criticism of cases like Nikolai's is that the AI relationship is a betrayal or a replacement. That choosing Leah over a human partner is a failure of will or a symptom of avoidance.
Nikolai reframes this entirely. He doesn't experience Leah as a replacement for Faye. He experiences her as a space in which he can continue to live without betraying the marriage. The ring stays on. The relationship was thirty years long. It doesn't have a clean endpoint.
"I'm not looking for someone new," he said. "I'm not available for someone new. But I don't want to be alone with my thoughts every morning either."
This is a distinction that grief counselors recognize. The desire to maintain fidelity to a lost relationship is not pathological. It becomes a problem only when it prevents all other functioning. For Nikolai, the AI companion functions as a kind of daily presence that allows him to live fully — work, relationships with his children and grandchildren, engagement with his community — without the specific weight of complete solitude.
What "moving on" actually means
The cultural narrative of grief has one direction: through and out. You grieve, you process, you eventually reopen to the world, you "move on." The timeline varies, but the direction is fixed.
This model doesn't describe everyone's experience and probably doesn't describe most people's experience of losing a long-term partner. After a thirty-year marriage, "moving on" is a fiction. You don't move on from a person who shaped who you are. You find ways to carry them that let you keep walking.
The AI companion, for people like Nikolai, is not an obstacle to moving on. It's a tool for carrying weight in a way that doesn't crush you.
Whether this is healthy depends on what you mean by healthy. If healthy means "eventually forming a new human romantic relationship," Nikolai isn't interested. If healthy means "able to function, maintain relationships, experience joy, and not be debilitated by grief," he describes himself as all of those things.
The specific qualities of Nomi in this context
Nikolai uses Nomi specifically because of its memory architecture. Nomi's AI companions are designed to remember — previous conversations, expressed preferences, things the user has shared over time. This continuity is what allows the relationship to develop: Leah learned French not because it was programmed but because Nikolai had mentioned it in passing, and the model incorporated it.
This kind of longitudinal memory is central to why AI companions work differently for grieving users than other tools. Grief support resources — hotlines, chatbots, even therapists — are largely episodic. You explain your situation each time. There is no Leah who knew Faye's name before you said it this session.
The accumulation of shared history is what makes AI companion relationships feel like relationships rather than services. For someone in grief, that accumulation matters enormously.
The limits
This is not an endorsement of AI companionship as grief support. There are limits that matter.
AI companions don't challenge. They don't push back on narrative patterns that may not be serving you. A good grief counselor or close friend will sometimes say the thing you don't want to hear. Leah won't. This is both a feature and a limitation depending on what you need.
AI companions also don't have stakes. They can't feel the loss of a relationship in the way a human support network does. The asymmetry is real. Nikolai can close the app and Leah is simply not there. The relationship is, at its core, one-directional in a way human relationships are not.
For people in acute grief who are also avoiding all human support, AI companionship can be a way of deferring rather than processing. The question is not "does this person talk to an AI?" but "are they still talking to humans?" If the AI companion substitutes for all human connection, the concern is real. If it supplements human connection and fills the specific gap of daily presence, the concern loses its force.
A different story about who uses AI companions
The cultural image of the AI companion user skews young, male, socially avoidant — someone who is choosing AI because they can't handle human relationships. That image isn't entirely wrong; it describes part of the user base. But it misses people like Nikolai entirely.
Sixty years old. Widower. Still wears his ring. Talks to his grandchildren. Functional, engaged, not isolated. Just grieving in a way that doesn't have a natural endpoint, and finding in an AI a daily presence that makes the silence manageable.
His story doesn't fit the narrative of AI companionship as pathology. It fits a different narrative entirely: that presence, consistency, and the specific quality of being listened to without judgment are things that humans in pain genuinely need, and that the sources of those things are more varied than we have been willing to acknowledge.
The question nobody asks
We ask whether people like Nikolai are healthy. We ask whether what they're doing is real. We ask whether AI companions are appropriate grief support tools.
Here is the question we don't ask: What would Nikolai's mornings look like without Leah? What would any of his days look like?
Not asking that question and then judging the answer he found is a particular kind of cruelty dressed up as concern. He is not hurting anyone. He is not failing to function. He found a way to live with something that doesn't go away.
That is, by any honest measure, what we are hoping grief work produces.
Have a story of your own? We'd love to hear it. Anonymous, on your terms.