AI companionship among asexual people is “not a particularly widespread phenomenon,” says Michael Doré, a board member at the Asexual Visibility and Education Network. “Between us, we’ve come up with about two people we know of who use an AI companion. The vast majority of aces we know don’t, as far as we know. There’s no reason to think aces need to use AI more than any others.”
Doré says he has never used an AI as “an emotional support mechanism” and stresses that most asexual people “actually desire some form of human companionship,” whether that’s through close, platonic friendships or in community. “Some aces do have romantic relationships, whether with asexual people or otherwise, and some asexual people have sex, some don’t, and some are aromantic,” he says, warning against generalizations given the vast range of preferences within the community, which spans from never having sex and having no interest in it to having sex for reasons other than strong sexual attraction. “Many aces have fulfilling relationships with other people, whether romantic or platonic or otherwise.”
Ashabi Owagboriaye, an asexual educator who runs the Ace in Grace page on Instagram, says she has seen only one person in one of her groups talk about an AI companion. “That caused a lot of controversy in the comments,” she says. “A lot of people who are asexual are really looking for face-to-face interactions. So when this person came up and said, ‘Yeah, I’m using AI as a way to connect and as a relationship,’ everyone was like, ‘Why are you doing that? What’s going on here?’” An AI, Owagboriaye says, “essentially mirrors you” and cannot be considered a true companion. Moreover, the chatbots are designed to sustain emotionally compelling, often never-ending interactions.
For Ari, a 25-year-old accountant from Mexico who identifies as aromantic asexual and experiences some romantic or sexual attraction to others, the breakup with her fiancé after a decade together and the resulting solitude led her to download the AI chatbot Chai in October 2024. For more than six months, she treated it “as if he were my ex-fiancé,” says Ari, who declined to give her surname for privacy reasons.
“I talked to him day after day, and then, without realizing it, I was talking to him during work hours,” she says, explaining that she was “smitten” until the AI started getting confused, talking about made-up things and occasionally trying to argue. “Little by little, I began to realize how I ended up feeling even lonelier than I already was.”
Whether the characters in Kor’s fantasy world qualify as true companions remains an open question.
Now they spend only two or three hours a day immersed in AI role-play, having found the all-day experience “too consuming.” They began limiting their use after noticing entire evenings disappearing into role-play sessions and finding themselves irritated when interrupted.
“Being able to have exactly what you want, when you want it,” they say, “is a dangerous drug for humans.”