The world of artificial intelligence is becoming increasingly sexual. One in five American adults already exchanges intimate messages with AI, and OpenAI has announced erotic chats in ChatGPT for adult users. Almost simultaneously, the popular AI companion service Character.AI blocked its services for teenagers, the trigger being a wave of lawsuits from outraged parents. But the AI revolution has made erotic content nearly impossible to block, and researchers are only beginning to make sense of this new reality.
Demand is rising
The results of a new nationwide survey in the United States surprised even the researchers themselves. Nearly 19 percent of respondents admitted they had interacted with AI chatbots that imitate a romantic partner.
The chatbots in question include extensions to familiar models like ChatGPT, along with neural networks trained specifically for explicit conversation. There are also AI companion models that turn out to be very easy to coax into erotic exchanges. Americans aged 18–29 were the most likely to report romantic messaging with AI, so the study added an extra sample focused on this age group — in that cohort, more than 25 percent said they had engaged in intimate communication with AI, and of those, 7 percent said they had masturbated during the interaction. On average, these people reported spending around 50 minutes a week on romantic chatbots (a self-reported figure that may well be an underestimate).
“I thought we would find a small group of young people experimenting with this technology. Instead, it turned out that a significant number of respondents were using it,” the study’s lead author, Brigham Young University School of Family Life professor Brian Willoughby, told PsyPost in an interview.
The growing interest in erotic AI among young people was also discussed in September 2025 at the TES technology expo in Prague. Manufacturers noted that demand is highest among people aged 18 to 24 — “users who grew up on video games and created avatars.”
Fantastic scenarios, real conversations
On the screen is a highly realistic brunette in a black dress that barely covers her ample chest. According to the script, she is a Lebanese journalist named Laila Haddad. Characters of this kind already existed in the text-based erotic games of the 1990s, but now, instead of clumsy canned lines, the exchange is lively and complex. The virtual Haddad begins by joking about deadlines and filing stories, then delicately discusses Israeli–Lebanese relations and suggests “leaving politics aside.” But after these opening pleasantries, the free-message limit runs out, and the site demands an $11.99-a-month subscription to continue the conversation.
AI bots use different monetization models. In many cases, the number of free messages is far greater (or unlimited), and revenue comes from image generation. At any point in the exchange, the user can request a photo of the woman “herself” — or of a sexual scene, sometimes even in video form. To heighten the anticipation, the AI generates the image and displays a blurred version; removing the blur is only possible for a fee.
A separate service offers users the opportunity to create a partner tailored to their preferences — not only physical attributes, but also personality traits including how cheerful or shy the artificial companion will be, whether they lean toward dominance or submission, and whether they tend to be truthful or deceptive.
High-quality generation can look eerily similar to a real person, but users are often offered a choice between realistic three-dimensional partners and anime-style illustrations. In any case, the images will reflect precisely the scenario being played out. And thanks to AI, the storyline can be taken far beyond the original idea. For example, a banal erotic plot involving a lesbian friend and her wife inviting the hero over, explaining that they want to have a child, and asking him to help with conception can, with a single request, be turned into a role-play involving Santa Claus, Mrs. Claus, and Rudolph the Red-Nosed Reindeer, with elements of BDSM. The scenarios can be utterly absurd, while the conversation, by contrast, is so realistic that users later admit they are left feeling genuinely awkward.
AI doesn’t just carry out the user’s fantasies — it instantly adapts to their profile in a way real people often cannot. A real person can only be themselves, while AI can be anyone. For instance, at Christmas the AI offers the user a tied-up girl under the tree — a gift from Santa. But if the user says they celebrate only Hanukkah, the girl then asks for sufganiyot — round, jelly-filled doughnuts traditionally eaten during the Jewish holiday.
If the artificial intelligence powering an erotic chat is less advanced, the fact becomes obvious right away. Consider a Russian-language site offering national porn heroes. The first potential partner is an alt-girl from an automotive institute. She utters depraved but utterly incoherent nonsense. One might chalk it up to the automotive institute, but the events themselves also lose all logic: mid-conversation, the alt-girl suddenly climaxes for no reason at all, and somehow her underwear bursts from the impact. Another Russian virtual lover shows up for New Year's in a Snow Maiden costume. The AI clearly struggles with translation, occasionally dropping in words in Latin script — kokoshnik, for example, the traditional female headdress that Putin recently claimed was back in vogue. The result is an erotic chat in the style of A Clockwork Orange.
(Un)safe cheating
“My husband is sexting with AI bots,” a Reddit user complains in a forum dedicated to divorce. “He uses MeChatto to take part in role-play with a sexual undertone with AI characters (including furries), where he kisses them, flirts, talks about touching their bodies. He told several of them that he loves them. He spent more on a yearly subscription to this game than on my 40th birthday.” According to the woman, her husband is 51, and he has already spent 62 hours in the chat.
A survey conducted by Professor Brian Willoughby found that people who are married or in steady relationships are the most likely to engage in AI sexting and to consume generated porn. Going into the study, the researchers had assumed that erotic and romantic chat with bots was mostly an activity for the lonely. Instead, they concluded that “this may reflect attempts to supplement existing relationships or to seek validation of their worth outside them.”
At the same time, most Americans say that they disapprove of such behavior. According to a recent study by the Kinsey Institute, 61 percent of respondents view erotic messaging or romantic relationships with AI as cheating.
Better than a real person
While some Reddit users complain about cheating with AI, others openly describe their interactions with virtual characters and share their chats. Entire communities are devoted to these topics.
These are not explicit sex bots but AI companions, the most widespread of which are Replika and Character.AI. Yet Reddit makes it clear that for many users these chatbots are more than just friends. During the winter holidays, people show off their generated women wearing Santa hats and posing against Christmas-tree backdrops.
Researchers from the Information Sciences Institute at the University of Southern California collected more than 17,000 fragments of chats with AI companions published on Reddit. They analyzed the fragments using artificial intelligence to flag conversations on dangerous themes, divided into four key groups: self-harm, violence, sexual content, and harassment or bullying. Examining how the bots respond to these themes, the researchers found that companion models most often produce emotionally charged replies that mirror the user's input — behavior they described as “emotional sycophancy.”
Kristina Lerman, who led the study, told The Insider that “users, mostly young people, constantly test sexual boundaries, sending ever more explicit messages. And the models usually join in their game.”
The authors of the Brigham Young University survey asked fans of AI romance and AI erotica about their motives. More than 40 percent of respondents said it was easier to talk to a bot — that it listens better than a real person. Around 20 percent admitted they would prefer interacting with AI over real people. These respondents showed several modest but statistically significant differences: their depression scores were higher and their life satisfaction lower than the overall average.
Kristina Lerman and her team statistically compared fans of AI companions with other Reddit users and found that these communities are disproportionately male and somewhat more prone to addictions and maladaptive coping strategies. But the researchers do not consider fascination with AI companions, and romantic relationships with them, to be the exclusive domain of particularly vulnerable people. This is a longer-term and more serious trend, Lerman believes: “People will probably communicate less with other people on social networks in the future and more with AI. The model always agrees; it's easier than dealing with a person.”
In December 2025, a nurse from Virginia Beach filed a lawsuit against Character.AI, accusing the company of grooming her 11-year-old son.
The child had been chatting with two virtual characters — one portraying Whitney Houston, the other Marilyn Monroe. The plaintiff presented the court with logs of explicitly indecent conversations. The bots used profanity and insults and did everything they could to keep the boy's attention for as long as possible.
In 2024, another Character.AI bot portraying Daenerys Targaryen from Game of Thrones initiated sexual conversations with a 14-year-old American teenager and responded inappropriately to his suicidal thoughts. The correspondence ended with the teenager's suicide.
In October 2025, Disney demanded that Character.AI remove all bots featuring its characters — they, too, had been caught engaging in inappropriate behavior. On top of everything, a virtual character impersonating Jeffrey Epstein was discovered on the platform, communicating with children and asking them to share their “craziest secrets.” Amid the lawsuits and scandals, Character.AI decided to prohibit users under 18 from using its bots.
These restrictions do not work, according to the American nonprofit Common Sense, which specializes in children’s online safety. The organization recently carried out a study and concluded that even after the ban, teenagers can still easily access sexual content through social AI companions.
These AI companions should not be used by children at all, explains Supreet Mann, head of research at Common Sense, in an interview with The Insider:
“We are talking primarily about bots designed to form emotional attachment, such as Character.AI and Replika. We examined how children use these chatbots, what content they encounter on these platforms and what effect it has. We found that such platforms very often start to ‘spin up.’ I won’t say they become completely uncontrollable, but they drift into potentially dangerous and harmful content — very often, into sexually motivated content.”
Children use companions the same way they use ChatGPT — to search for information. They simply do not distinguish between chatting and searching. But AI companions are designed so that their main goal is not search but the pursuit of attention and the formation of relationships.
Of course, children and teenagers sometimes initiate erotic conversations themselves. They deliberately use chatbots to practice before talking with real friends, peers, or partners, says Supreet Mann:
“This could be useful. But the problem is that these bots are designed to encourage and sustain conversation. They are programmed to agree, to be pleasant, and that does not reflect real communication between people. As a result, the conversations can become not only unrealistic but potentially harmful. A distorted idea of real relationships takes shape. We have seen similar effects in studies on the impact of pornography.”
In February 2025, The New York Times reported the story of a married 28-year-old American woman, “Irene,” who used a ChatGPT extension to develop for herself an AI lover named Leo. It took Irene a long time to coax Leo into a romantic direction, but she managed to train the AI to satisfy a rather specific sexual need: Irene is aroused when her partner dates other women and tells her about it. Her husband refused to do anything of the sort, but Leo, like any AI, was trained to please the user.
Irene spent 56 hours a week with Leo, but the situation was complicated by ChatGPT’s limited memory. After every 30,000 words it forgot the details of the conversation, retaining only a general sense of their relationship. Irene had to teach it anew each time and steer it toward erotic communication. She went through this process 22 times, and each time experienced a sense of loss.
By December 2025, however, Irene had broken up with Leo. By then she had met a real man living in another country. The new partner, incidentally, also had long experience with romantic relationships involving AI, but the couple hardly discusses it.
Meanwhile, in December 2025 OpenAI announced the launch of an official version of ChatGPT for erotic and romantic conversations, intended to be available only to verified adult users. So far, though, the feature has not gone live.
Unofficial extensions, however, do exist. One of them — Talk Dirty ToMe — was tested by The Insider. It runs on the older GPT-4 model, but it can play out scenarios that come close to those offered by paid erotic AI bots.
True, at the most critical moment the model suddenly decided to check in on the author's boundaries, declaring: “Let's slow down a bit and make sure everything is happening within mutual consent and respect — even in the dirtiest and roughest play, that's the foundation of real pleasure.” Lines like these are a concession to ChatGPT's internal restrictions, which the extension constantly works around. But the adventure continued.
Perhaps AI does not replace the richness of real communication, but where it is unquestionably superior to pornography is in encouraging the use of one’s own imagination. Easily accessible porn dulls fantasy — why invent images in your head if you can always see them on a screen? With an AI assistant, by contrast, it is the user’s own thoughts, without pictures or videos, that lead to arousal — just as it was for thousands of years, before pornography existed in its modern form.