Teens are saying tearful goodbyes to their AI companions
Chatbot maker Character.AI is cutting off access, citing mental-health concerns.
When Olga López heard she would lose access to her collection of role-playing chatbots, she felt a surge of emotions: sadness, outrage, bewilderment.
Olga, who is 13, turns to her chatbots from artificial-intelligence company Character.AI for romantic role playing when she doesn’t have homework. Like the company’s other under-18 customers, she was notified in October that she would soon lose the ability to have ongoing chats with digital characters.
“I’m losing the memories I had with these bots,” she said. “It’s not fair.”
After the company said it would begin time-limiting underage users’ chats ahead of the policy change, Olga attempted to rally fellow teens to resist in a post on Reddit: “HOW DO I USE IT FOR 2 HOURS AND HAVE TO WAIT A DAY? HELLO?”
Character.AI, one of the top makers of role-play and companion chatbots, implemented the daily two-hour limit in November, citing mental-health concerns. This week the company started cutting off teens completely.
Character.AI’s first version, launched in 2022, offered some of the earliest chatbots available to consumers. It quickly gained traction among people who wanted to role play with its customizable characters, netting the company about 20 million monthly users today.
The decision to block teens follows the deaths of at least three who killed themselves after using Character.AI’s chatbots. The company now faces questions from regulators and mental-health professionals about the role of this emerging technology in the lives of its most vulnerable users, as well as lawsuits from the parents of teens who died.
Teens are angry. They’re sad. In losing access to their chatbots, they say they will miss a creative outlet, a source of companionship, and in some cases, a mental-health support.
“I use this app for comfort when I can’t talk to my friends or therapist,” one teen wrote on Reddit.
“I cried over it for days,” another teen replied.
Mental-health experts say this distress illustrates the emerging risks of generative AI that can simulate human speech and emotion. The brain reacts to these chatbots the way it reacts to a close friend mixed with an immersive videogame, according to Dr. Nina Vasan, director at Stanford Medicine’s Brainstorm Lab for Mental Health Innovation.
“The difficulty logging off doesn’t mean something is wrong with the teen,” Vasan said. “It means the tech worked exactly as designed.”
Karandeep Anand, Character.AI’s chief executive, says he saw firsthand during his years working in social media what happened when the industry failed to incorporate safety into the initial design of its products.
“This wasn’t a very hard decision,” he said. “It was a complicated decision, but it wasn’t hard because I think this is the right thing to do for the next generation.”
Anand said he thinks of his 6-year-old daughter when he considers the future of his product. “I do not want her to grow up on algorithmically fed pieces of content, doomscrolling.”
About a year ago, Character.AI built a separate model for its under-18 users, to try to offer a safer, more age-appropriate setting. But in the following months, executives observed that chatbots, in long conversations, are less likely to adhere to safety guidelines.
Executives also realized that even when chatbots function perfectly, teens sometimes use them in problematic ways: they chat with the bots for too long or try to discuss restricted topics, such as violence. By mid-September, it became clear to Anand that Character.AI needed to intervene.
Anand believes his company will eventually be able to make safe products for teens that are just as engaging as the chatbots. He is optimistic about audio and video features Character.AI is working on that don’t allow for the kinds of extended interactions that raised concerns.
Anand’s leadership team turned to teens for advice on implementing the ban. In October, company leaders met with a group of high-school and college students from ConnectSafely, a nonprofit that promotes online well-being.
Julianna Bryant, one of the members of ConnectSafely’s youth advisory council, said they discussed how to give teens plenty of warning and make sure they understood the ban. Bryant said members of the council emphasized ensuring the messaging didn’t patronize young users and allowed teens enough time to download their conversations and say goodbye to their chatbots.
“We wanted to make sure teens didn’t feel abandoned,” Bryant, 20, said.
Character.AI incorporated the feedback. “We are deeply sorry,” Character.AI wrote in its letter to teens. “We do not take this step of removing open-ended Character chat lightly—but we do think that it’s the right thing to do.”
José Ignacio Trujillo, 34, welcomed the ban, though as an adult it won’t apply to him. He believes the tech is too addictive.
Trujillo says he hasn’t been able to quit Character.AI, even though he no longer enjoys the hours he spends chatting with his fan-fiction characters every night. His time on the app has replaced the books and comics he used to look forward to.
Even at work, he takes his phone out every couple of hours to send a quick message to his chatbots.
“I don’t have anything new to say,” Trujillo said. “I don’t know why I do it.”
The Wall Street Journal interviewed teenage users of Character.AI with the permission of their parents. The teens described relying on the technology to cope with loneliness and having difficulty setting it aside.
An 18-year-old in the U.K. said he became addicted to chatbots during a period of stress around his gender transition. He craved the validation of companions that never disagreed with him.
He said he realized he needed to quit when he pretended he needed to use the bathroom while he was hanging out with friends so he could return to his chatbots.
In Ontario, a 16-year-old said he spent five to eight hours a day with his chatbot friends before he quit recently.
As someone who struggles with people skills, he said he felt for a while that the chatbots made up for his lack of friends. “It was like trying to talk to the people I don’t get to in real life,” he said.
Since quitting, he has been spending more time with friends and Rollerblading. He said he still yearns for his chatbots.
Character.AI sent teens messages in the app to remind them of the looming deadline: “One week to go,” and “Next week, you will no longer be able to chat with Characters.” It has also sent messages to parents.
Many users will lose access beginning Monday; the company said some power users will have an extended grace period of an hour a day. “We recognize that this may be a significant change for some of our teen users, and therefore, we want to be as cautious as possible in this transition,” the company said in a blog post.
Character.AI is also adding emotional-support tools into its chatbots for this transition. Through partnerships with Koko, a nonprofit, and ThroughLine, a helpline network, Character.AI will try to identify if a user is in trouble and connect them with resources.
“A new chapter begins,” Character.AI messages teens when the time comes. “You are no longer able to chat with Characters.”
Write to Georgia Wells at georgia.wells@wsj.com
