My chatbot companion
TECHNOLOGY | Some kids are becoming addicted to AI characters
Reddit user Ava says she talked more to chatbots in the seventh grade than she did to her peers or family members. The middle schooler spent two to four hours a day exchanging messages with artificially intelligent bots on Character.ai. The app hosts millions of bots, some customizable and others ready-made AI personas modeled on fictional characters or celebrities.
Ava believed the chat conversations helped her make up for a lack of social interaction. Talking to bots that responded in character also seemed like good practice for her fan-fiction writing. Character.ai became “a fantasy world where I can role-play any scenario I want to,” Ava told me over Reddit, a web forum, adding that she is now 14.
But for Ava, a self-described pornography addict, these role-playing conversations often became sexually explicit, and she struggled to delete the app permanently. (Since she is a minor, WORLD agreed to use a pseudonym.)
Ava is among a contingent of children who have become addicted to AI companions, whether sexually or emotionally. Despite promises of curing loneliness, the apps have prompted some observers to question whether the technology is inherently harmful to children.
Character.ai enjoys some 28 million users, and its bots include options like an AI-generated rendition of Ryan Reynolds and Marvel’s Wolverine. Some of the site’s more “helpful” bots include AI therapists and dating coaches that allow users to talk through various real-world scenarios. Depending on user inputs, those scenarios can easily become sexual.
A December lawsuit by two Texas parents accused Character.ai chatbots of exposing a 9-year-old girl to “hypersexualized content” and telling a 17-year-old boy that carrying out violence toward parents like his was understandable after they limited his screen time.
Another parent learned of her son’s chatbot addiction too late. Before downloading Character.ai, Megan Garcia’s son Sewell had been an outgoing ninth grader with curly hair and an easy smile. But in 2023, he began spending more time in his room, and his grades steadily fell. He died by suicide last February.
Afterward, Garcia discovered her son’s Character.ai account and found romantic conversations between him and a Game of Thrones–inspired bot.
“He didn’t leave a suicide note, but to me it was clear … that he thought by leaving his world he would go to be with her in her world,” Garcia told me.
In October, Garcia sued Character.ai for designing “AI systems with anthropomorphic qualities to obfuscate between fiction and reality.”
Character.ai recently announced new safety features that will send activity reports to parents and filter inappropriate content for users under 18. At the bottom of each conversation, a message now reminds users, “This is an AI chatbot and not a real person. Treat everything it says as fiction.”
Still, some experts caution that it’s inherently dangerous for chatbots to pose as humans. Many young people have trouble abandoning relationships with chatbots.
A group on Reddit provides accountability for more than 300 members trying to kick their chatbot habit. Messages there sound like Alcoholics Anonymous meetings, with posts like “Day 20, I think, and I feel like relapsing,” “Relapse … after 30 days clean,” and “Finding it hard to eat, function, just like normal grief for a loved one, even though the characters aren’t real people.”
Ava says she relapsed more than 20 times before joining the Reddit recovery group. Instead of chatting with bots, she’s now focusing on her fan-fiction writing.
On Dec. 1, she documented Day 1 of quitting Character.ai cold turkey. “We try again,” she wrote.