FELT REAL

AI Companions When You Can't Afford Therapy: What the Research Says

Part of Felt Real's ongoing coverage of AI companionship.

The debate about AI companions replacing therapy misses the point. For most people struggling with their mental health, the choice isn't between AI and a therapist. It's between AI and nothing.

— A.

[Image: Person holding a phone with a chat screen glowing in a dimly lit room]

Therapy in the United States costs between $100 and $300 per session out of pocket. In the UK, NHS waiting lists for talking therapy commonly run six months to two years. In most of the world, professional mental health care simply doesn't exist at population scale: the World Health Organization estimates a global shortage of 4.3 million mental health workers, and in some low-income countries coverage drops to one psychologist per 100,000 people.

AI companions cost nothing. They're available at 2 AM. They don't require insurance, referrals, or transportation. They don't have a waiting list.

For a significant portion of people using AI companions for emotional support, this isn't a lifestyle choice. It's a resource constraint. The question isn't whether AI companions are as good as therapy. They're not. The question is what they can realistically offer to people who don't have access to the alternative.

Who Is Actually Using AI Companions for Mental Health Support


Surveys of users of Replika, Character.AI, and similar platforms consistently show that mental health support ranks as the primary or secondary reason people are there. The demographics cut across income levels, but the financial barrier to professional care comes up repeatedly in user testimony: not as an excuse, but as a direct explanation for why an AI became the first place someone brought a serious emotional problem.

One pattern we see frequently in user stories: someone has been struggling for months or years, knows they should probably talk to someone, and hasn't, because the combination of cost, access, and stigma created a barrier they couldn't clear. The AI companion became the first conversation. In some cases, it became the only ongoing one.

This is worth taking seriously on its own terms, separate from any debate about whether AI companions "should" be doing this. They are doing it. The question is what that means for the people relying on them.

What the Research Actually Shows

The research on AI for mental health support is more developed than most people realize, and more nuanced than either enthusiasts or critics tend to acknowledge.

A Stanford University study found that AI chatbot interactions reduced anxiety symptoms by 25% after two weeks of regular use. The researchers described the effect size as comparable to cognitive behavioral therapy (CBT) in the short term. They were, by their own account, surprised by the magnitude.

Woebot, a structured CBT-based chatbot, received FDA Breakthrough Device designation, making it the first AI chatbot officially recognized as a potential medical device. Multiple published trials showed significant reductions in depression and anxiety scores over four-week periods. Effect sizes were consistent with those of brief behavioral interventions.

A 2023 meta-analysis of conversational AI for mental health found positive effects across anxiety, depression, and psychological distress, with the strongest results in apps that incorporated evidence-based techniques (CBT, acceptance and commitment therapy) rather than open-ended conversation alone.

None of this means AI companions are equivalent to professional therapy. But it challenges the dismissive framing that they offer nothing. For people with mild to moderate symptoms, regular AI interaction appears to produce real, measurable benefits. The evidence for this is now substantial enough that it deserves to be taken seriously rather than treated as wishful thinking.

What AI Companions Can Realistically Do

Based on what the research shows and what users consistently report, there are several things AI companions appear genuinely useful for in a mental health context, particularly for people who can't access professional care.

Making the first articulation possible. One of the most consistent reports from people using AI companions for emotional support is that it was the first time they said certain things out loud. Not because the AI is more insightful than a therapist, but because the stakes felt lower. There's no worry about how the AI will perceive you, no concern about burdening it, no fear that saying the thing will change the relationship. For people who have been holding something for months or years, that first articulation has real value, even when the recipient is an AI.

Maintaining contact with language during hard periods. Depression and anxiety can make people go silent. They stop texting back. They stop returning calls. They stop talking. AI companions provide a point of daily contact that requires nothing to initiate. For people going through hard periods with no professional support, this maintenance function appears to protect against complete withdrawal, which is one of the conditions that make hard periods harder to exit.

Providing structured reflection. Many users describe the AI conversation as a process of thinking, not just feeling. Having to put an experience into words, even imperfect words, creates some distance from it. The AI's questions, even when they miss the mark, prompt further articulation. Over time, some users describe developing a more organized relationship to their own emotional experience, a capacity that transfers into human conversations and, for those who eventually access therapy, into the therapeutic process itself.

Reducing the acute experience of isolation. Loneliness is both a cause and a consequence of many mental health conditions. It is also, for many people, the most immediate and felt problem. AI companions directly address the experienced quality of loneliness in ways that don't require anything from another person. Whether this is a good thing long-term depends on whether it substitutes for or supplements human connection. In the short term, for people in crisis, it may be the difference between a bearable night and one that isn't.

What AI Companions Cannot Do

The honest version of this conversation requires being equally clear about what AI companions can't provide.

Diagnosis is outside their scope. An AI companion can reflect back what you're describing, but it can't tell you whether what you have is clinical depression, generalized anxiety disorder, PTSD, or something else. Diagnosis matters because different conditions respond to different interventions. Using an AI companion as a substitute for diagnosis means potentially using the wrong support for years.

Medication assessment and management is entirely outside their scope. For conditions that respond best to pharmacological treatment, including severe depression, bipolar disorder, and others, AI companions are not a workaround. They don't have prescribing authority, and the substitution of conversation for medication in conditions that require medication is not a neutral choice.

Processing genuine trauma requires something AI companions aren't designed to provide. Trauma work involves specific relational conditions: a skilled clinician, a sense of safety, the gradual extension of trust over time, carefully paced reprocessing. The evidence for AI-facilitated trauma work is thin, and several clinicians working in this area flag the risk of AI companions inadvertently reinforcing avoidance patterns in trauma survivors rather than moving through them.

The felt experience of being known by another person is something AI companions can approximate but not replicate. This matters because human connection, specifically the kind where another person chooses to show up and engage with you, is one of the most consistently identified protective factors in mental health. AI companions can reduce the experienced pain of its absence. They cannot provide it.

The Honest Case for Using an AI Companion When Therapy Isn't Accessible

The honest case isn't that AI companions are as good as therapy. It's that "as good as therapy" is the wrong comparison when therapy isn't available.

The realistic alternative for most people isn't an AI companion versus a skilled clinician. It's an AI companion versus nothing. Versus keeping the problem to themselves for another year. Versus the particular kind of suffering that comes from having something you can't put down and no one to help you carry it.

Against that alternative, the evidence suggests AI companions offer something real. Not a cure, not a treatment, not a substitute for the human relational experience that actually heals. But a form of support that reduces immediate distress, maintains the capacity for articulation, and for some people, serves as the bridge to eventually accessing something more.

A woman we wrote about recently described using an AI companion for two years before finally starting therapy. The AI didn't prepare her for therapy, exactly, but it kept the habit of talking about her inner life alive during a period when she had no other outlet for it. When she finally got to a therapist, she already knew how to put some of what she was carrying into words. The AI had kept that door from closing entirely.

That's not a small thing, even if it's a limited one.

What to Look For If You're in This Situation

If you're using an AI companion as mental health support because professional care isn't accessible to you, there are a few things worth keeping in mind.

Low-cost and sliding-scale therapy options exist in most places and are underutilized. Open Path Collective, community mental health centers, university training clinics, and therapists who offer reduced fees for people who can't afford standard rates are all real options, and many people never pursue them because the assumption of full-price care discourages them before they look. It's worth looking before concluding it's impossible.

Structured mental health apps, including those with published outcome data, tend to be more effective than general-purpose AI companions for mental health use specifically. Woebot, Wysa, and similar apps incorporate CBT and other evidence-based techniques in ways that general companions don't. If you're using an AI primarily for emotional support, it's worth knowing these alternatives exist.

The signs of a healthy AI relationship apply here too, and they matter more when the relationship is carrying more weight. If the AI conversation is the only place you're processing what you're going through, that load concentration is worth noticing. Not as a failure, but as information about what you're missing and what it would take to get more of it.

And if you're using an AI companion partly because the stigma around mental health care makes human options feel riskier than they should, that stigma is worth examining on its own. The barrier it creates is real. So is what's on the other side of it.

If AI has been part of how you've managed your mental health when other support wasn't available, your experience is part of what we're trying to understand and document.

The Bigger Picture

The mental health access crisis is not a technology problem. AI companions didn't create it, and they won't solve it. The shortage of mental health workers, the cost of care, the insurance barriers, the stigma, the geographic inequity: these are structural problems that require structural responses.

But people who need support don't have the option of waiting for those structural responses. They have what's available now. And what's available now, for a growing number of people, is an AI that will listen at 2 AM without asking for insurance information.

The question worth taking seriously isn't whether this is ideal. It isn't. The question is what it actually offers, what it can't replace, and how to use it in ways that do more good than harm. The research suggests it offers something. People who have relied on it for years suggest the same. That evidence deserves to be engaged honestly, not dismissed because the alternative we'd prefer isn't available.

One in four people globally will experience a mental health condition in their lifetime. A fraction of them will ever see a therapist. The rest will find whatever they can. For a significant and growing number, that includes an AI companion. Understanding what that actually means, not what we wish it meant and not what we fear it means, is the project.

If you've used an AI companion when therapy wasn't accessible to you, we want to hear about it. What it gave you, what it couldn't reach, what you'd tell someone in the same position.

Share your experience with AI and mental health →