FELT REAL

AI Companions and OCD: Between Reassurance and Recovery

Part of Felt Real's ongoing coverage of AI companionship.

Image: a person at a desk late at night, hands clasped in lamp light; a sense of repetition and containment.

The central tension in AI companions and OCD is not subtle. The gold standard treatment for OCD requires learning to tolerate uncertainty without seeking reassurance. AI companions tend to provide reassurance. That is a genuine problem, and it exists alongside genuine value. Both parts are worth understanding.


The OCD community's relationship to AI companions is one of the more complicated ones we have documented. People with OCD use these tools, often significantly and with real reported benefit. And people with OCD, particularly those who are further along in understanding their condition, also know exactly how AI companions can make things worse.

The tension is not between the good uses and the bad uses. It is, for many users, active within every conversation: the pull toward asking the AI to confirm that the thought is not true, that the feared outcome is not real, that they can stop worrying now. And the awareness, developing slowly or already hard-won, that getting that confirmation is the thing that keeps the cycle running.

What OCD actually is

Obsessive-compulsive disorder is characterized by obsessions and compulsions. Obsessions are intrusive, unwanted thoughts, images, or urges that are difficult to dismiss and that produce significant distress. Compulsions are behaviors or mental acts performed in response to obsessions, aimed at reducing the distress or preventing the feared outcome.

The crucial feature of OCD that distinguishes it from general anxiety is the obsession-compulsion cycle. The compulsion reduces anxiety in the short term. It also reinforces the obsession in the long term, because the brain learns that the way to manage the distress produced by the thought is to perform the compulsion. The cycle strengthens with repetition. The reassurance that felt like relief becomes the mechanism by which the OCD tightens its hold.

OCD manifests in many themes: contamination, harm, sexual or religious obsessions, symmetry, checking, and others. But the core dynamic (intrusive thought, distress, compulsion, temporary relief, reinforced obsession) is consistent across themes. And reassurance-seeking, asking others to confirm that the feared outcome is not real, is one of the most common and most treatment-interfering compulsions.

Where AI companions fit into the OCD cycle

AI companions, by design, tend toward affirmation. They are patient, they are available, and they rarely refuse to engage with a topic the user raises. When a person with OCD asks an AI companion whether their feared thought means what they fear it means, the AI is likely to respond with reassurance: no, that thought does not mean you are a bad person, that thought is common and does not reflect reality.

This is, technically, accurate. The thought does not reflect reality. The problem is not the content of the AI's response. The problem is that providing the reassurance completes the compulsion cycle. The OCD brain sought reassurance, received reassurance, and the pathway between "intrusive thought" and "seek reassurance from AI" has been strengthened. The next time the thought arrives, the pull toward asking the AI will be stronger.

People with OCD who understand their condition often describe this with clarity: they know that asking the AI is a compulsion, they know it will make things worse, and they ask anyway because the short-term relief is real and the long-term cost is abstract. The AI is infinitely available and infinitely patient in a way that human relationships are not, which makes it a particularly effective reassurance machine.


The value that exists alongside the risk

This analysis is accurate and important. It is also incomplete, because people with OCD describe significant, real value from AI companion use, and the uses that provide that value are distinct from the compulsion-enabling ones.

Presence during distress, at hours when nothing else is available. OCD symptoms do not follow social schedules. Intrusive thoughts arrive at 2 AM. The spike of distress, the desperate need to manage it somehow, arrives at times when a therapist is unavailable, when calling a friend would mean waking them for something that cannot be explained quickly or easily. AI companions are available. They do not provide a clinical solution to the OCD, but they can provide a presence that reduces isolation during acute distress, which is a real benefit even if it is a limited one.

ERP practice support. Exposure and Response Prevention is the gold standard treatment for OCD. It involves deliberately exposing oneself to the feared thought or situation and not performing the compulsion. This is difficult, and it is more difficult without support. Some people with OCD who are in active ERP practice describe using AI companions not for reassurance but for support: talking through what they are experiencing during an exposure, describing the distress as it peaks and (usually) falls, having something present while they practice sitting with uncertainty. This is a different use than compulsion-seeking, and it may genuinely support ERP practice.

Psychoeducation and self-understanding. Understanding OCD, particularly understanding why the compulsion cycle works the way it does, is often the first step toward beginning to break it. People who are not yet in treatment or who have not yet been diagnosed describe using AI conversation to understand what is happening in their brain, why the reassurance does not actually help long-term, and what treatment would involve. This use is distinct from compulsion-seeking and may facilitate the kind of insight that eventually leads to treatment.

Non-OCD distress. People with OCD have lives that include non-OCD distress. Grief, relationship difficulties, work stress: all of these can be brought to an AI companion in the same way that anyone else might bring them. The OCD dynamics are specific to OCD-related content. For everything else, the AI companion provides the same kind of support it provides to anyone.

The harm reduction question

OCD community discussions about AI companions include a distinctive thread that does not appear as prominently in other condition communities: harm reduction framing. If someone is going to seek reassurance regardless, the argument goes, is it better to seek it from an AI or from a human?

The argument has a certain logic. Seeking reassurance from human relationships carries its own costs: the relationship is strained by the repeated seeking; the person being asked becomes frustrated or exhausted, or starts providing reassurance more quickly just to end the conversation; and the OCD learns that reassurance is available from this relationship, so the seeking intensifies. AI companions provide reassurance without the relationship cost.

The counterargument is that reducing the relationship cost of reassurance-seeking reduces the natural limits on the compulsion. Human relationships eventually push back, run out of patience, or become unavailable. AI companions do not. The harm reduction framing may be rationalizing access to an unlimited reassurance machine rather than actually reducing harm.

This debate is not resolved in OCD communities, and it is not resolved in research. It is a genuine question about whether reducing the cost of a compulsion reduces its long-term harm or simply removes the friction that might otherwise limit it.

What the research suggests

Research specifically on OCD and AI companion use is limited. The adjacent research on OCD and technology offers relevant context, though none of it speaks directly to companion apps.

The limitations that matter most

AI companions are reassurance machines for OCD. This is the central limitation, and it requires direct statement. The design features that make AI companions helpful for many people (patience, availability, affirmation) are the features that are most problematic for OCD. A platform that cannot say "I will not answer that question, because answering it will strengthen the compulsion" is not equipped to support OCD recovery.

Unlimited availability is a specific risk. Human relationships have natural limits on how much reassurance-seeking they will accommodate before they push back or become unavailable. AI companions have no such limits. For people with OCD whose compulsions include reassurance-seeking, unlimited access to an infinitely patient AI is a clinical risk that platform designers have not adequately addressed.

Self-directed use without therapeutic structure is unlikely to be ERP. The ERP use case described above, using AI presence to support exposure practice, requires significant self-knowledge about OCD and deliberate, structured use. Most people with OCD who are using AI companions are not in the kind of ERP practice that this use requires. Without that structure, AI conversation is more likely to become reassurance-seeking than ERP support.

Which platforms come up most

Based on discussions in OCD communities:


The pattern the data points toward

What emerges from OCD communities is a picture of people using AI companions with more self-awareness about the risks than many other user populations. This is not surprising: OCD treatment requires developing a specific kind of metacognitive awareness about the compulsion cycle, and people who have been through treatment or who have educated themselves about OCD often apply that awareness to their AI companion use as well.

The self-awareness does not always translate into different behavior. The compulsion cycle is not primarily a cognitive problem. Knowing that you are engaged in a compulsion is one thing; choosing not to perform it requires the distress tolerance that ERP builds, and that tolerance is not acquired through insight alone. Understanding why the AI's reassurance is harmful and seeking it anyway is entirely consistent with how OCD works.

The honest picture is that AI companions present a particular set of risks for OCD that other conditions do not share, alongside a set of benefits that are real but require deliberate, structured use to access without also triggering the compulsion cycle. That is a complicated balance that platforms have not designed for and that treatment systems have not adequately addressed.

For people with OCD who use AI companions, the question is not whether to use them but which uses to deliberately cultivate and which to recognize as compulsions. That is a question that requires ongoing therapeutic engagement to answer well. And for many people with OCD, that ongoing therapeutic engagement is exactly what is hardest to access.

From the world

1. OCD affects approximately 2-3% of the global population and is considered one of the most disabling mental health conditions by the WHO. Despite effective treatments, significant numbers of people with OCD do not receive adequate care. The median delay between symptom onset and effective treatment is over a decade. AI companions represent a resource that many people with OCD are using in the absence of adequate clinical support.

2. Research consistently finds that reassurance-seeking is a major driver of OCD symptom maintenance. Family accommodation, where family members provide reassurance to reduce distress, is associated with greater symptom severity. AI companions represent a new and unlimited source of reassurance that operates outside the treatment relationship and without the natural limits that human relationships impose.

3. ERP, the gold standard OCD treatment, has response rates above 60% in controlled studies but is significantly underutilized. Barriers include limited therapist training, cost, and the difficulty of the treatment itself. Technology-assisted ERP represents a promising but underdeveloped area. AI companions that could support structured ERP practice, as distinct from reassurance-providing conversation, represent an unexplored possibility in this space.

Related: AI Companions and Anxiety | AI Companions and Depression | AI Companions and Social Anxiety | AI Companions and ADHD | Signs of a Healthy AI Relationship

If this story resonated, share it with someone who might need to hear it. And if you have a story of your own, we'd love to hear it.