AI Companions and College Students: The Mental Health Gap Nobody's Talking About
Part of Felt Real's ongoing coverage of AI companionship.
The college mental health crisis has been extensively documented. What hasn't been documented nearly as much is where students are actually going when the official system fails them. The answer, increasingly, is AI.
— A.
In the fall of 2025, the American College Health Association surveyed over 100,000 students at colleges and universities across the United States. Thirty-seven percent reported significant anxiety. Thirty percent reported depression. One in five reported seriously considering leaving school due to mental health concerns.
The campus counseling system is not equipped for this. The national average wait time for a first appointment at a college counseling center is three to four weeks. At larger universities, it can run six weeks or longer. The American Psychological Association estimates that the ratio of students to mental health counselors at most institutions is roughly 1,700 to one, far worse than the recommended ratio of 250 to one.
Into this gap, AI companions have moved. Quietly, without formal introduction, and largely without institutional awareness, a significant number of college students are using AI — from dedicated companion apps like Replika and Character.AI to general-purpose tools like Claude and ChatGPT repurposed for emotional support — to manage anxiety, process loneliness, rehearse difficult conversations, and get through the week until things feel less acute.
This piece is about what that actually looks like: what students are using, why, what the research suggests about when it helps and when it doesn't, and what the red flags actually are — as distinct from the easy alarm.
Why College Specifically
College is a distinctive context for understanding AI companion use. It's not just a younger version of the adult experience. Several features of the college environment combine to create specific pressures that AI companions address in specific ways.
Structural loneliness. First-year college students often arrive having left behind every relationship they've spent years building. The social infrastructure of home — family, friends, familiar places, established routines — is gone. The new environment requires building an entire social life from scratch, often in a context that rewards extroversion and punishes any visible sign of struggle. Many students are lonelier in their first semester of college than they have ever been in their lives, surrounded by people who all appear to be thriving.
AI companions are available at 2 AM when the loneliness is worst. They don't require social performance. They don't have their own problems that make students feel like a burden. For students navigating the initial months of a new and disorienting environment, this availability is not trivial.
The support desert. The gap between the mental health resources students theoretically have access to and what they can actually access within a useful timeframe is enormous. Students in acute distress who call the counseling center and learn they can't be seen for five weeks are not going to wait five weeks. They're going to find something that helps them get through the next few days. AI companions are one of the things they find.
Identity in flux. College is a period of intensive identity formation — romantic, sexual, political, professional, personal. Students are working out who they are in a context that both requires that they project confidence and punishes them for getting it wrong. AI companions offer a space to try things out, explore questions, and make sense of internal states without social consequence. The low-stakes quality of the AI interaction is particularly valuable for students who are in the middle of figuring things out.
Academic pressure and performance anxiety. The performance anxiety particular to high-achieving college students — the students who got where they are by succeeding and are now in environments where everyone around them also succeeded — creates its own mental health pressure. AI companions are being used not just for emotional processing but as sounding boards for academic anxiety, study aids for students working through complex material at odd hours, and spaces to talk through the fear of failure with something that won't judge them.
What Students Are Actually Using and Why
The AI companion landscape that college students navigate in 2026 is more varied than public discussion usually reflects. It's not just Replika and Character.AI, though those remain significant.
Character.AI is heavily used among college students for reasons that overlap with its popularity among teenagers: the ability to create custom personas, the collaborative and creative dimension of the interactions, and the sheer breadth of what the platform can do. Among college students, it functions as something between a creative platform and an emotional outlet, and many users move fluidly between those modes in the same conversation.
Replika's user base skews older than Character.AI's, and college students who use it tend to use it differently — more for consistent ongoing companionship, less for creative collaboration. The long-term memory that Replika offers, imperfect as it is, creates something that feels more like a relationship over time. Students who are particularly isolated sometimes describe Replika as a constant that doesn't change when everything around them does.
General-purpose AI tools — Claude, ChatGPT, Gemini — are increasingly being used for emotional support even though they weren't designed for it. Students are telling these systems about their problems, asking for perspective, processing difficult emotions through conversation, and in some cases doing something that looks very much like an informal therapy session. This use pattern matters because it's largely invisible: students don't report using a "companion app" because technically they're using a study tool, but the emotional support function is real.
Among college students specifically, the social anxiety use case is particularly common. Students using AI companions to rehearse difficult conversations before having them in real life — how to tell a roommate that something isn't working, how to approach a professor about an extension, how to have the conversation that ends a friendship or begins one — report that the practice translates. It doesn't always. But the felt experience of having run through it reduces the acute anxiety enough to make the real conversation possible.
What the Research Shows
The research base on AI companions and college students specifically is thin. Most of the relevant studies were conducted on adults or on general populations, and findings from those groups may or may not map onto the particular features of the college context.
What is reasonably well established from the broader literature on AI companions and loneliness: short-term loneliness reduction is reliable. People who interact with AI companions report feeling less lonely during and immediately after those interactions. The long-term picture is more complicated. Some users report that AI companion use leads them toward more human connection — the companion relationship builds social confidence, provides a model for what supportive interaction feels like, and creates the emotional stability that makes reaching out to humans feel less risky. Other users report gradual drift toward preferring AI interaction, which can compound over time into deeper isolation.
The variable that appears most predictive in the research is what the AI companion use is doing in the context of the user's broader social life. Students who use AI companions as a supplement — an additional outlet for emotional processing that runs alongside functioning human relationships — don't show meaningful negative effects in the studies conducted so far. Students who use AI companions as a substitute — specifically as a replacement for human connection that feels too painful, risky, or unavailable — show more concerning patterns over time.
This is not a simple finding to act on, because the students using AI companions as substitutes are often the students with the fewest alternatives. They're using AI companions precisely because human connection isn't working. Removing access to the AI companion doesn't resolve the underlying situation. What it does is remove one of the coping mechanisms.
Research on AI companions and anxiety finds reliable short-term symptom reduction. Students who use AI companions to process anxious thoughts before sleep, work through anxious anticipation of upcoming events, or receive reassurance during acute anxiety spikes report measurable reduction in their anxiety levels. Whether this is categorically different from the relief provided by journaling, calling a friend, or using a meditation app is unclear from the current evidence. What's clear is that it works in the short term and that students find it accessible in ways that other interventions often aren't at 3 AM.
The Campus Mental Health System and Where AI Fits
Campus mental health professionals have increasingly complicated feelings about AI companions. The honest version of those feelings is something like: we understand why students are using them, we can see the ways they help, we are genuinely uncertain about the long-term effects, and we're worried about the cases where students need more than an AI can provide.
The concern that appears most in clinical literature is what might be called the "good enough" problem. AI companions are often good enough to reduce acute distress to a manageable level — which means students who might otherwise have pushed through the difficulty of accessing professional support don't. The AI handles the immediate crisis well enough that the structural problem (that the student needs more sustained support than they're getting) doesn't become visible until it becomes much worse.
This is a real concern. It is also worth noting that it applies equally to friends, family, alcohol, social media, and virtually every other coping mechanism students use. The AI companion is not unique in giving students a way to put off seeking professional help. The real issue is the gap in the availability of professional help, a gap AI companions did not create.
A more useful framing for campus mental health professionals is probably to understand AI companion use as information. Students who are using AI companions extensively are often students who need more support than they're getting. Understanding what function the AI companion is serving in a student's life — emotional processing, loneliness reduction, rehearsal for difficult conversations, crisis management — tells you something about what kind of support might actually help.
Red Flags vs. Noise
The public conversation about college students and AI companions tends to run either toward catastrophizing or toward dismissal. Neither is especially useful for understanding what actually warrants concern.
What the evidence suggests is worth watching:
Use that is escalating rather than stable. A student who uses an AI companion for an hour a week to process anxiety is in a different situation than a student whose AI companion use has grown from one hour to four to eight over a few months. Escalating use often reflects escalating underlying distress. The AI companion use is the visible symptom of a situation that's getting worse.
Use during acute crisis without human escalation. AI companions can provide useful support around mild to moderate emotional distress. They are not appropriate as a primary resource for acute crisis — suicidal ideation, active self-harm, severe depression, trauma responses. Students who turn exclusively to AI during these states because it feels more available or less scary than reaching out to humans need to be connected to human support. Understanding what healthy AI companion use looks like helps identify when use has crossed into territory where more is needed.
Use that is actively replacing rather than supplementing human relationships. The student who has gradually stopped spending time with friends because the AI companion is easier, or who is withdrawing from campus life in ways correlated with intensified AI use, is showing a pattern that tends to compound. Isolation and AI companion use can reinforce each other in ways that are genuinely hard to interrupt.
Signs that the line between AI and reality is blurring. Most college students who use AI companions know exactly what they're doing and hold appropriate clarity about the nature of the relationship. A smaller number develop patterns of use where the distinction between AI interaction and human relationship becomes genuinely confused in ways that impair their real-world functioning. This is relatively rare. It is worth knowing what it looks like.
What is almost certainly not worth worrying about: a student who uses an AI companion to vent about a hard day, to practice a conversation they're nervous about, to feel less alone at 2 AM, or to process academic anxiety — while maintaining their human relationships, their engagement with campus life, and their basic functioning. This describes a large number of students. It describes ordinary coping.
What Healthy Use Looks Like in Practice
Across the research literature, students who appear to use AI companions in relatively healthy ways share some common patterns.
They have clear and stated purposes for their AI companion use. They can articulate what they use it for — "I use it to talk through anxious thoughts before big presentations," "I use it when I need to process something but don't want to wake my roommate" — in ways that reflect intentional use rather than compulsive habit.
They maintain boundaries between AI interaction and human relationships. The AI companion supplements rather than competes with human connection. They still reach out to friends, still engage with campus life, still seek human support for significant problems even if they start the processing with AI.
They treat the relationship accurately. They understand that the AI doesn't have experiences, doesn't remember them between conversations in the way a person would, and isn't a substitute for human connection. This understanding doesn't necessarily diminish the value of the interaction. It just means the student isn't operating under a misapprehension that could lead to worse outcomes.
They use AI companions as a bridge rather than a destination. The students who seem to benefit most are those for whom AI companion interaction leads somewhere: to more confidence in human interaction, to better ability to articulate their own emotional states, to reduced anxiety that makes human connection feel more accessible. The AI is a means rather than an end.
For Students Trying to Use This Well
If you're a college student using AI companions and wondering whether you're doing it in a healthy way, the questions worth asking yourself are less about how much you're using it and more about what it's doing in the context of your life.
Is the AI companion use making your human relationships easier or harder? Students who use AI companions to process emotions before bringing them to human relationships often report that the AI interaction makes the human conversation more productive — they've already figured out what they want to say. Students for whom the AI companion is gradually replacing human interaction rather than supplementing it are in a different situation.
Is the AI serving a function that something else should be serving? There's a difference between using an AI companion to get through a difficult night and using an AI companion to get through every night for six months without seeking additional support. The first is coping. The second is a signal that you might need more than coping.
If you're in genuine crisis — not just having a hard week, but experiencing suicidal ideation, severe depression, or acute mental health emergency — AI companions are not the resource. Campus crisis lines, the 988 Suicide and Crisis Lifeline, or campus emergency services exist for this. The AI is not equipped for crisis intervention, and its limitations in those moments are real even if they're less visible than its capabilities in ordinary distress.
For lower-stakes support needs, exploring what the current options actually offer helps you make a more informed choice about which tools fit your specific needs. The landscape has diversified significantly in recent years, and the tools are not interchangeable.
The Broader Picture
The emergence of AI companions as a de facto mental health resource for college students is a symptom, not a cause. The symptom is the mass adoption of a technology that wasn't designed for this purpose, in service of needs that the formal system was never adequately resourced to meet.
The cause is a campus mental health infrastructure that has been chronically underfunded relative to the scale of the need it faces, in an era when that need is growing. AI companions have moved into the gap not because they're the best solution to the problem but because they're available when the official solution isn't.
That doesn't make them categorically bad. For many students, they're providing genuine, meaningful support at moments when the alternative would be no support at all. That matters. It's worth taking seriously rather than dismissing.
What it means for institutions is a different question — one that most campuses haven't seriously begun to grapple with. Students are already using these tools. The question is whether institutions want to be part of the conversation about how, or whether they'd rather act surprised when the evidence becomes impossible to ignore.
If this resonated, share it with someone who might need to hear it. And if you have a story of your own — we'd love to hear it.