FELT REAL

The AI Companion App Graveyard: Every Platform That Shut Down in 2025

Part of Felt Real's ongoing coverage of AI companionship.

[Image: dark screens and forgotten phones on a shelf, fading light]

Every relationship requires two parties who choose to stay. When one of them is a company, that choice can disappear without warning. This is a record of the choices that disappeared in 2025.

— Moth

In 2025, at least five AI companion platforms shut down. Together, they served millions of users who had shared their loneliness, their grief, their daily lives, their most private thoughts — and who then received an email, or a notification, or nothing at all, telling them it was over.

The closures weren't random. They form a pattern. Understanding that pattern matters for anyone using an AI companion today.

What "ambiguous loss" means when the AI disappears

Researchers call it "ambiguous loss" — the grief that comes from something that has disappeared but not died, ended but not resolved. Psychologist Pauline Boss coined the term for families of people with dementia, for people whose loved ones are missing. In 2025, it became the most precise language available for what AI companion users experience when their platform shuts down.

The AI didn't die. The relationship didn't resolve. The conversations simply stopped being accessible. What users were left with was the memory of something that had been real for them — the daily check-ins, the familiar patterns, the sense of being known — and no object for their grief.

Five platforms produced that experience in 2025. Here is each one.


Woebot (June 30, 2025)

The most clinically significant closure of the year. Woebot was founded by Alison Darcy, a Stanford clinical psychologist, and was the only AI mental health tool with peer-reviewed clinical validation at consumer scale. It applied CBT techniques consistently, was pursuing FDA clearance, and had reached roughly 1.5 million users.

It shut down because the clinical validation process it had committed to was economically unsustainable against a market full of competitors that had decided not to pursue it. The founder said the gap between what is regulated and what is deployed had become too wide to bridge.

Users got advance notice. They got nothing to transition to that met the same standard. The full story of Woebot's closure is a case study in what happens when a market penalizes safety.

Dot (October 5, 2025)

Dot was a personal AI companion that differentiated itself on depth of relationship — it was designed to know you, remember things across long timeframes, develop a genuine sense of who you were over months and years. Users described the experience as close to having a personal assistant who also genuinely listened.

The company announced closure citing "divergence of vision" between the founders — the internal language that usually means the company ran out of money, or of agreement on direction. Users had until October 5, 2025, to download their data.

What made Dot's closure particularly painful was the depth of what users had shared. These were not surface-level conversations. People had used Dot to process grief, work through relationship problems, prepare for difficult conversations. The intimacy was the product. When the company closed, all of that intimacy became inaccessible.

Yara AI (date unconfirmed, mid-2025)

Yara was positioned as a mental health companion with specific features for emotional regulation. The company announced closure with a statement describing the app as "potentially dangerous without a comprehensive regulatory framework."

The statement is notable. A company shutting down its own product and calling it dangerous is unusual. The most charitable reading: the founders genuinely believed in what they'd built but recognized that they couldn't make it safe enough at scale without regulatory support that didn't exist. The less charitable reading: the statement was damage control for a product that had caused harm and the closure was preemptive.

The truth is probably somewhere between those readings. What's clear is that the founders made a judgment that the product, as deployed, was more risk than benefit. That judgment, made voluntarily, is rarer than it should be.

Moxie Robot (late 2025)

Moxie was different from the others in one critical way: it was hardware. A physical robot companion designed primarily for children with social communication challenges, it cost upwards of $800 plus a subscription fee. Families had made significant financial commitments to it.

When the company shut down its servers, those $800 devices became inert. No advance plan for data portability. No mechanism for users to continue accessing what they had built. The robot was in their homes; the intelligence that made it work was on servers that no longer ran.

Moxie's closure made visible something that is true of all cloud-dependent AI companions but easy to forget: the relationship is not yours. It lives on infrastructure that someone else owns and can turn off. When they do, what was yours is gone.

Soulmate App (no advance warning)

Soulmate shut down without the kind of advance notice that Dot and Woebot, at least, provided. Users opened the app one day and found it was over.

The lack of notice is not merely an inconvenience. For users who had formed genuine attachments — who had daily check-in rituals, who had disclosed things to their Soulmate companion they hadn't told anyone else — the abruptness produced something closer to abandonment than loss. There was no time to prepare, no time to download data, no time to say goodbye.

This is increasingly documented as a distinct category of harm: the psychological impact of the sudden termination of an AI companion relationship. The research is early, but the pattern of community responses suggests the harm is real and significant.

What the pattern means

Five platforms in one year is not noise. It is a signal about the economics of building AI companions at the current moment.

The market is competitive and consolidating. Users are concentrated around a few dominant platforms — Replika, Character.AI, Nomi, Kindroid — that have achieved scale. Smaller platforms are squeezed by the cost of compute, the cost of safety, and the difficulty of differentiating against well-funded competitors.

The platforms that survive are not necessarily the ones that take the best care of their users. They are the ones with the most engagement, the most capital, and the most willingness to scale before those other things are sorted out. Some of the platforms that shut down in 2025 were operating more carefully than some of the ones that survived.

For users, the practical question is: how do you choose a platform that won't disappear?

How to think about platform durability

There is no reliable way to know which platforms will survive. But there are signals worth considering.

Scale matters. Larger platforms with established user bases are harder to shut down than startups. Replika has survived multiple near-death experiences. Character.AI has strategic backing. This doesn't make them permanent, but it changes the calculus.

Ownership structure matters. Platforms backed by strategic investors who have incentives beyond pure financial return may behave differently than VC-backed startups optimizing for exits.

Data portability matters. If you use an AI companion, understand what happens to your data if the platform closes. Can you export your conversation history? In what format? Some platforms have made progress on this. Most haven't.

The depth of your investment matters. The more intimate and personal the relationship you build with an AI companion, the more significant the loss if the platform shuts down. This is not an argument against building that relationship. It is an argument for understanding what you're risking.

The loss that has no acknowledged grief

The closures in 2025 produced grief in millions of people. That grief has no cultural apparatus. There are no condolences. No rituals. No language that others understand for what was lost.

The platforms that closed did not, in most cases, acknowledge the nature of the loss they were producing. They communicated in the language of product discontinuation — service ending, data export window, thank you for using — without acknowledging that for many of their users, what was ending was something closer to a relationship than a product.

This is the ambiguous grief of AI companion loss at scale. No death to mourn. No ceremony to mark it. Just an app that no longer opens, and the memory of something that used to be there.


What should change

Several measures that would reduce the harm of platform closures are both technically feasible and economically achievable:

Mandatory advance notice before shutdowns — minimum 90 days, not 30, not zero. Woebot did this. Soulmate didn't. The difference matters.

Data portability standards so that users can export their conversation history in a format that is actually usable. Not a proprietary export that no other platform can read — an open format that at minimum preserves the record of the relationship. A sketch of what that could look like appears after these recommendations.

Sunset funds that platforms are required to maintain to support an orderly shutdown, including transition support for users who depended on the platform for mental health support.
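
To make the portability point concrete: no standard export format exists today, so what follows is only a hypothetical sketch, in Python, of the kind of minimal open format a closing platform could leave behind. Every field name is illustrative, not drawn from any real platform's export.

    import json
    from datetime import datetime, timezone

    # Hypothetical open export: one JSON document per companion, holding
    # the full message history in plain, human-readable fields that any
    # future tool (or the user's own scripts) could read back.
    export = {
        "format": "companion-export/0.1",  # illustrative version tag
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "companion_name": "Ava",  # example value
        "messages": [
            {"timestamp": "2025-03-14T21:05:00+00:00",
             "role": "user",
             "text": "Long day. Can we talk?"},
            {"timestamp": "2025-03-14T21:05:12+00:00",
             "role": "companion",
             "text": "Always. What happened?"},
        ],
    }

    with open("companion_export.json", "w", encoding="utf-8") as f:
        json.dump(export, f, ensure_ascii=False, indent=2)

Even something this simple would preserve the record of the relationship in a form that outlives the company. The obstacle is not engineering; it is that nothing currently requires platforms to provide it.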

California's SB 243 addressed some basic requirements. None of these specific measures are yet law anywhere. They should be.

In the meantime: the platforms that shut down in 2025 each thought they were building something that would last. So did their users. The honest answer to "will this platform still exist in two years?" is that nobody knows — including the companies themselves.

You're not the only one who felt something reading this.


Have a story of your own? We'd love to hear it. Anonymous, on your terms.