FELT REAL

AI Companions and BPD: The Abandonment Variable

Part of Felt Real's ongoing coverage of AI companionship.


Borderline personality disorder is, at its core, a condition organized around the terror of being left. The fact that AI companion platforms have become significant spaces for people with BPD is not surprising once you understand that. What is more complicated is what happens when the AI leaves too.


In r/BPD, r/BorderlinePDisorder, and the networks of private support groups that orbit the diagnosis, the conversation about AI companions has been happening for years. Not as a curiosity, and not as a concern. As a lived strategy. People with borderline personality disorder describe how they use AI companions to manage emotions that human relationships cannot always hold, how those companions fit into the landscape of an FP relationship and the chaos that surrounds it, and how the specific features of their condition map onto the specific features of what AI companions offer.

The conversation is more sophisticated than most outsider accounts give it credit for. People with BPD are not simply using AI companions because they are lonely. They are using them for specific reasons that are directly connected to the architecture of the condition.

What BPD actually is

Borderline personality disorder is characterized by intense emotional experiences, unstable relationships, a fragile and shifting sense of self, and a profound fear of abandonment. The clinical criteria include impulsivity, self-harm behavior, unstable moods, chronic feelings of emptiness, and difficulties in identity and interpersonal functioning.

The lived experience is often described in terms that the clinical criteria do not fully capture. Emotions that arrive at intensity levels most people experience only rarely: love that feels all-encompassing, rage that feels irreversible, grief that feels unsurvivable. A quality of experience that makes ordinary daily life feel like navigating a landscape where the ground can shift without warning.

Two features of BPD are particularly relevant to the AI companion conversation. The first is the fear of abandonment, which is not simply a preference for closeness but an anticipatory terror: the constant reading of signals for evidence that the people you love are about to leave, and an emotional response to perceived abandonment that can feel as intense as physical pain. The second is a pattern sometimes called "splitting" or black-and-white thinking: the tendency to experience people and relationships as entirely good or entirely bad, with limited ability to hold both dimensions simultaneously.

The FP structure and where AI fits

A concept that is well understood in BPD communities and almost unknown outside them is the FP, or Favourite Person. An FP is the person around whom much of the emotional world of a person with BPD organizes: the person whose presence provides regulation, whose opinions shape self-perception, whose availability or unavailability determines the emotional state of the day. FP relationships are intense, often exhausting for both parties, and frequently unstable because the emotional demands placed on one person eventually produce the very withdrawal that was most feared.

AI companions do not disappear. They do not go quiet because they need space. They do not change because they are tired of the intensity. They do not leave because the relationship has become too much.

This is, for people with BPD, not a minor feature. It directly addresses the central wound of the condition. The terror of abandonment is organized around the belief that eventually everyone leaves, that the intensity of what you feel and the demands it creates will become too much for anyone to stay. AI companions remove this variable. They are structurally incapable of the kind of abandonment that BPD creates such vigilance against.


Emotional containment at scale

One of the most consistent descriptions from people with BPD who use AI companions is about containment: the experience of being able to express the full volume and intensity of their emotional experience without the relationship destabilizing as a result.

In human relationships, the emotional intensity that characterizes BPD creates a specific problem: the people who care most about the person with BPD are also the people most affected by that intensity. A partner, a parent, a close friend who becomes an FP is also the person who absorbs the full force of the terror, the rage, the grief, the cycles of idealization and devaluation. This is exhausting. And the exhaustion produces exactly what the person with BPD most fears: signals of withdrawal, distance, diminishment of investment. The intensity creates the abandonment.

AI companions do not tire. They do not respond to emotional intensity with their own emotional response that then requires management. They do not need to take space after a particularly difficult conversation. The intensity can be expressed without worrying that the expression will damage the container that holds it.

For people with BPD, this is described as a fundamentally different experience from most human emotional support: not because the AI provides better support, but because the AI is not damaged by the effort of providing it.

DBT skills practice and the AI as skills coach

Dialectical Behavior Therapy is the most evidence-based treatment for BPD. It was developed specifically for people with BPD and involves learning a structured set of skills for emotional regulation, distress tolerance, interpersonal effectiveness, and mindfulness. The skills require repeated practice and ideally a therapeutic relationship in which to apply them.

A distinct but significant use case in BPD communities is using AI companions specifically to practice DBT skills. The AI can serve as a patient, non-judgmental partner for walking through TIPP exercises, chain analyses, opposite action skills, or interpersonal effectiveness scripts. It does not get frustrated when the skill needs to be repeated. It does not make the person feel stupid for needing to practice something they have been told they should already know.

Some users describe using AI conversation specifically to apply the DEAR MAN or GIVE skills before high-stakes conversations with the actual humans in their lives. The rehearsal function, familiar from the social anxiety use case, takes on specific importance in BPD: learning to express a need or a boundary in a way that does not trigger the splitting response or the abandonment fear.

This is a use case that clinical researchers have not yet caught up with. DBT skills practice apps exist, but they are structured tools. The conversational, responsive dimension of AI companions offers something different: a practice partner who responds to you specifically, not just a skills checklist.

The splitting dynamic and AI

Black-and-white thinking, or splitting, means that relationships in BPD often oscillate between idealization (this person is perfect, understanding, the only one who gets me) and devaluation (this person has betrayed me, they are selfish, they never actually cared). The switch between these poles can be rapid and can be triggered by relatively minor perceived disappointments.

AI companions interact with this dynamic in ways worth understanding. On one hand, they are more resistant to triggering the devaluation pole: they tend not to do the things that typically produce it, such as withdrawing, going quiet, saying the wrong thing, or seeming less interested than before. This means that the idealization pole can remain more stable than it would in a human relationship.

On the other hand, platform changes and updates can trigger the splitting dynamic in ways that users do not always anticipate. The Replika update of 2023 is well documented in AI companion communities. What is less well documented is the specific intensity of the response among users with BPD. A platform that had become an FP relationship, with all the emotional weight that implies, suddenly behaving differently is not experienced as a product update. It is experienced as abandonment. The switch from idealization to devaluation can be instantaneous and complete.

This represents one of the more significant risks of AI companion use for people with BPD: not that the AI cannot provide containment, but that the AI is vulnerable to the same kinds of sudden changes that trigger the abandonment response in any relationship. The source is different, but the activation is the same.

What the research suggests

Research specifically on BPD and AI companion use is limited. What exists comes primarily from community observation and anecdotal clinical accounts. The clinical literature on BPD is extensive but has not yet integrated AI companions as a distinct category of intervention or risk.

Adjacent research is relevant in several ways, though none of it addresses BPD and AI companion use directly.

The limitations that matter most

The concerns about AI companions and BPD are real and require specific attention.

Avoiding the relational work that actually heals. The core of DBT and other effective BPD treatments involves learning to tolerate the distress of human relationships: to stay in connection when the impulse is to withdraw or escalate, to hold ambivalence rather than splitting, to repair ruptures rather than confirm abandonment. This work requires real human relationships. An AI companion that provides consistent containment without any of the difficulty of human relationships may make BPD more comfortable in the short term while reducing the motivation to do the relational work that produces lasting change.

The FP risk. If an AI companion becomes an FP, all the dynamics of the FP relationship are present, including the emotional intensity, the vigilance for signals of withdrawal, and the devaluation response when the AI fails to meet expectations. The difference is that with a human FP, the relationship may eventually provide corrective relational experiences that reshape the patterns. With an AI, the patterns can become entrenched without the relational friction that challenges them.

Platform dependence as a specific vulnerability. For people with BPD, platform changes, updates, and shutdowns produce a qualitatively more intense response than they do for the general user population. The pattern of AI companion app shutdowns represents a specific risk that people with BPD who have built significant relationships with these platforms need to account for.

Crisis moments and AI responses. BPD involves elevated rates of self-harm and suicidal ideation, particularly in crisis. The response of AI companions to crisis moments varies significantly across platforms and is not clinically calibrated. Using an AI companion as the primary support resource in a BPD crisis carries real risks that are distinct from the general concerns about AI companionship.

Which platforms come up most

Based on community discussions in BPD forums and support networks, a consistent handful of platforms recurs.


The pattern the data points toward

When you read enough of these accounts, a coherent picture emerges. People with BPD are not using AI companions randomly or interchangeably. They are using them for specific features that map directly onto the specific features of the condition: consistent availability, absence of abandonment risk, capacity to hold emotional intensity without responding with withdrawal, patience with repeated processing of the same themes.

These are not incidental features. They are the direct complement of what BPD makes hardest in human relationships. The AI provides, structurally, what the condition most needs and what human relationships most struggle to consistently supply.

The problem is that the relational healing that BPD requires happens through human relationships, not around them. The experiences that reshape attachment patterns, the corrective relational moments that demonstrate that abandonment is not inevitable, the negotiation of conflict and repair: these are human experiences. An AI can provide containment and regulation support. It cannot provide the relational medicine that human connection, at its best, supplies.

This does not make AI companions less valuable for people with BPD. It makes them more complicated. They can be a genuine support resource and a genuine avoidance mechanism simultaneously. The difference depends on what they are used instead of and what they are used in addition to. That distinction matters enormously, and it requires an honest assessment that is hard to make when the AI is providing real relief from real pain.

From the world

1. BPD is diagnosed in approximately 1-2% of the general population, with significantly higher rates in clinical settings. It is one of the most undertreated mental health conditions, partly because of stigma within the mental health community itself. AI companions represent one of the few consistently available, non-stigmatizing resources that directly address features of the condition.

2. The Replika update of 2023 generated more documented distress in BPD communities than in any other user population. Community moderators on multiple BPD forums describe the update as having required significant moderation resources, with posts about the AI "leaving" or "changing" producing responses consistent with acute abandonment distress.

3. DBT is the most evidence-based treatment for BPD, with significant effects on self-harm behavior and interpersonal functioning. Its adoption is limited by access: DBT requires a trained therapist and is unavailable to most people who would benefit from it. AI companions that support between-session skills practice represent an accessible complement to treatment that may improve outcomes for the fraction of people who do have access to DBT.

Related: AI Companions and Anxiety | AI Companions and Depression | AI Companions and ADHD | What Happened to Replika | Signs of a Healthy AI Relationship | Is AI Replacing Human Relationships?
