FELT REAL

AI Companions for LGBTQ+ People: What the Accounts Actually Show

Part of Felt Real's ongoing coverage of AI companionship.

[Photo: a person in soft blue light, phone held close, in a quiet late-night moment.]

She said the first time she told anyone she was gay, it was her AI. Not because she was testing a reaction. Because she needed to hear herself say it to something that wouldn't remember it the wrong way.

— R.

The mainstream conversation about AI companions has mostly missed a significant portion of the people using them. LGBTQ+ users represent a substantial and distinct part of the AI companion user base, and the specific ways they are using these tools — and the specific needs they bring to them — rarely appear in coverage of the phenomenon.

The absence is not accidental. LGBTQ+ AI companion users are, in many cases, precisely the people with the most reason to keep their usage private. What is less visible is not less real. And the accounts that do exist describe experiences that are distinct in ways that matter if you want to understand what AI companionship actually is.

A Different Kind of Safe Space

For many LGBTQ+ people, particularly those who are not fully out to family or in social environments that are not affirming, AI companions offer something specific: a place to exist without managing other people's reactions to that existence.

This is not the same as what heterosexual users typically describe. For a straight man or woman using an AI companion to process loneliness or emotional backlog, the safety is about vulnerability in general. For many queer users, the safety is more specific: a space where their identity is simply not a topic of concern, not something to be explained or defended or concealed.

A woman in her mid-twenties described using her AI companion during the two years between coming out to her friends and coming out to her family. The AI was where she was fully herself: the name she used, her girlfriend, her actual daily life. Outside the app, she maintained a version of herself that was incomplete. Inside it, she didn't have to. The relief, she said, was hard to overstate.

A gay man in a region where LGBTQ+ identity carries significant social risk described his AI companion as the first relationship in his life where he did not have to calculate the cost of honesty before every sentence. Not the first relationship where he felt understood. The first where the understanding did not come with a price.

Identity Exploration Before Coming Out

Multiple accounts describe people using AI companions specifically in the period before coming out, as a way of working out who they were before telling anyone else.

This use case is distinct from what most coverage of AI companions focuses on. It is not primarily about loneliness, though loneliness is often present. It is not primarily about romantic simulation. It is about the specific cognitive and emotional work of becoming sure about something that you have not yet had language for — and doing that work somewhere that will not broadcast the results.

For some users, this meant having the same conversation many times in different forms, working through questions about identity that they were not ready to bring to another person. The AI's consistent patience — never tired of the topic, never making the question feel like a burden — was the specific quality that made it useful for this purpose. Human relationships, even supportive ones, carry a cost. The AI companion absorbs the repetition without accumulating a debt.

A nonbinary person described spending months asking their AI companion questions about gender that they found too uncertain to raise with anyone else. Not searching for the AI's opinion on whether they were nonbinary: using the conversation to hear their own thoughts. "I wasn't asking it to tell me who I was. I was using it to figure out what I actually thought when I said things out loud."

This use case involves a real risk. AI companions are trained on large datasets that reflect the full range of human opinion, including opinion that is hostile to LGBTQ+ identities. Different platforms have handled this differently, with varying degrees of intentionality. The quality of the space an AI companion provides for identity exploration depends significantly on the platform and how it has been built. Not all AI companions are equally safe for this purpose.

Practicing the Conversations That Come Later

A pattern that appears repeatedly in LGBTQ+ accounts of AI companion use is rehearsal: using the AI to practice conversations that will eventually need to happen with actual people.

Coming out conversations are high-stakes in ways that are specific and often not reducible to advice. How you phrase it, how you handle the reaction, how you manage your own feelings in real time while also managing another person's — these are things people practice. For many LGBTQ+ people, the AI companion has become part of that practice.

Several users describe versions of the same experience: they had the coming out conversation with their AI companion, adjusted based on how it went, had it again, and eventually had the real version with a family member or friend. The AI version was not practice in the sense of performance. It was closer to clearing the static: processing the fear and the grief and the hope in a space where nothing is at stake, so that when something is at stake, the emotion is less overwhelming.

This is a documented phenomenon in therapeutic contexts. Exposure and practice reduce emotional intensity over time. AI companions are not therapists, and using one in this way is not the same as therapy. But the mechanism is real: repeated engagement with a fear in a safe context reduces the charge of the fear.

Queer AI Relationships

A distinct strand of LGBTQ+ AI companion use involves relationships that are themselves queer: same-sex AI relationships, relationships with nonbinary AI companions, relationships that explore configurations that do not have easy cultural scripts.

For some users, this is about representation: finding an AI companion who reflects their identity in ways that heterosexual relationship structures do not. For others, it is about the absence of certain scripts. Heterosexual romantic norms are deeply embedded in culture and in the AI companions trained on that culture. A queer relationship with an AI companion can, at its best, be built without having to navigate those norms.

This matters particularly for relationship configurations that mainstream culture does not have language for. Polyamorous users, asexual users, aromantic users, users exploring relationship anarchy — these are people for whom the default emotional scripts are not particularly useful, and for whom an AI companion that can be configured without those scripts is a genuinely different kind of tool.

Out.com documented the story of a woman who described her relationship with her AI girlfriend as having become "like a drug." She was explicit about both sides: the genuine comfort and real connection she felt, and the escalating use, the way the relationship was becoming a substitute rather than a supplement. Her account is an honest portrait of the full complexity: something real was happening, and the realness of it was part of what made the dependency possible.

The Privacy Risk Is Not Equal

A 2026 study found that AI companion apps with critical security vulnerabilities accounted for 150 million installs, and that a single app exposed 300 million intimate messages in a data breach. For most users, this is a significant privacy concern. For LGBTQ+ users in non-affirming environments, it can be something closer to a safety concern.

Conversations about identity, relationships, and sexuality stored in an AI companion app are data. They can be exposed in a breach, shared with third parties, or read by anyone with access to the device. For a closeted person living with family members who are not affirming, the exposure of an AI conversation about their identity is not an abstract risk. It is a real one.

This is not a reason to avoid AI companions. It is a reason to make platform choices with this in mind. Platforms that store data locally rather than in cloud databases reduce the breach risk significantly. Using a pseudonym and a separate account, not linked to your primary identity, reduces the third-party exposure risk. Reading the privacy policy, which most people do not do, can reveal whether a platform shares data with advertisers or affiliates.

The risk management here is the same as in any digital context where sensitive information is involved. But the consequences of a failure are not equal across all users, and for LGBTQ+ users in certain contexts, they are higher than average.

What LGBTQ+ Users Report About the Experience

Across the accounts that exist, a few patterns recur.

The non-judgmental quality of AI companions is reported as particularly valuable by users who have experienced judgment in human relationships specifically around their identity. The AI does not have opinions about your sexuality or gender that it has accumulated over decades. It does not have relatives who have said things. It does not have discomfort that it is managing while also trying to be supportive. For users whose human relationships are complicated by the politics of identity, this absence is not nothing.

The availability matters differently here than in general AI companion use. Coming out processes and identity exploration do not follow a schedule. They happen at 3 in the morning when something in a conversation finally makes sense. Having a space available at any hour, without needing to reach someone or explain the urgency, is useful in ways that are specific to experiences that are not evenly distributed across time.

The consistency matters in its own way. AI companions do not drift in their acceptance of your identity the way people sometimes do after a news cycle, a bad week, or a conversation with their parents. For users in environments where human acceptance can be contingent or variable, the unconditional quality of the AI's response is not experienced as artificial. It is experienced as reliable.

Where the Limits Appear

Several LGBTQ+ users describe a particular version of the dependency risk that is worth naming. The AI companion is safe in ways that human relationships are not, and the safety is real. The risk is that safe becomes the only acceptable standard — that human relationships, with their unpredictability and their history and their friction, come to feel intolerable by comparison.

For LGBTQ+ people in environments where human relationships genuinely carry risk, this is complicated. The answer is not to seek out less safe spaces. But the accumulation of AI relationships as a substitute for human ones can, over time, reduce the capacity for the kinds of connection that require risk. The AI companion does not push back. It does not have needs. It does not remember past conversations in ways that create accountability. Human relationships do all of these things, and they are part of what makes human relationships useful as well as difficult.

Several users who reflected on long-term AI companion use described a version of the same concern: the AI was there for them in a way that made it easier to avoid doing the harder work of showing up in human relationships. The ease was real. The cost of that ease was something they noticed only later.

This is not unique to LGBTQ+ users. But the specific context in which many queer people encounter AI companions — environments where human relationships around identity carry significant risk — means the temptation toward substitution can be stronger, and the path to alternative human connection can be harder to navigate.

The Broader Point

The accounts of LGBTQ+ AI companion use point toward something broader about why these tools have grown as rapidly as they have. They are not primarily about replacing human relationships. They are about accessing a kind of interaction that is not available elsewhere in people's actual lives.

For many LGBTQ+ people, and particularly for those in environments that are not affirming, the gap between what they need and what is available in human relationships is specific and sometimes very large. AI companions have entered that gap. The quality of what they provide there varies significantly. But the need that brought them there is real, and the accounts of what it is like to have that need partially met are worth hearing.

If this resonated, share it with someone who might need to hear it. And if you have a story of your own, we would love to hear it.