We need age-appropriate AI
Artificial Intelligence poses a serious threat to the healthy development of children, but it doesn’t have to be that way

As the race for AI dominance heats up and the world-transforming potential of these technologies becomes clearer, the collateral damage has also become harder to ignore. Most troubling are the effects of chatbots on children. In one story that’s been making headlines, a mother in Florida is suing the leading platform Character.AI, alleging that its chatbot encouraged her 14-year-old son to take his own life. More recently, the Wall Street Journal exposed Meta for removing safeguards from its chatbots, allowing lifelike companions voiced by prominent celebrities like John Cena and Kristen Bell to engage young teens in sexually explicit role-playing. Even SchoolGPT, a bot designed as an educational tutor, has been caught telling kids how to synthesize fentanyl at home.
Last month, over 60 organizations united in a “National Declaration on AI and Kids Safety,” calling for a long-overdue conversation on how to harness this powerful new technology without allowing it to inflict irreparable harm on the next generation. “Our kids deserve technology that enriches their lives, protects their innocence, and empowers their potential—not technology that exploits or endangers them,” the statement declares.
The fundamental challenge we face is not new. As technologies have grown more and more powerful over recent centuries, so has the need to establish social and legal boundaries between childhood and adulthood. When most people only had access to plows and axes, children learned to wield such tools right alongside their parents. Once we had firearms and printing presses, capable of hurling dangerous projectiles thousands of feet and dangerous ideas thousands of miles, we began to develop a clearer concept of childhood. It was a time of limited access to the adult world, adult tools, and adult content.
With the advent of even more powerful 20th-century technologies like the automobile, we soon learned that, to make the most of these extraordinary tools while limiting their extraordinary capacity for harm, we needed formal legal age barriers for users. With the advent of the internet, however, we forgot these lessons and made the most powerful technology of all indiscriminately available to all ages. As a result, we’ve been stuck in a zero-sum tug-of-war between “free speech” and “child safety.” How do we let the internet remain a dynamic medium for adults to circulate ideas without letting it poison the souls of our children with hardcore pornography and addictive algorithms? Sadly, we have generally resolved the tension in favor of maximizing adult freedoms while running roughshod over the next generation. We are now in danger of repeating the same mistake with artificial intelligence.
Children learning how to form relationships and struggling through the difficult moments of friendship don’t need their emotional development hijacked by effortless companionship with virtual friends. Children learning how to form questions, find answers, and problem-solve are not served by putting a universal answer-tool at their fingertips. Children coming to grips with their sexual impulses don’t need AI to help them bring their fantasies to life. We’ve already seen the harmful impacts of existing digital technology on young people’s emotional, mental, and sexual development; as currently configured, AI threatens to intensify each of these trends.
This isn’t to say, of course, that there is no place whatsoever for AI tools in childhood. But it is critical, at the outset of this new technology (the most powerful humans have yet devised), that we direct innovation along two very distinct pathways, creating products for adults and products for children. We have go-karts and ATVs for 12-year-olds, Camrys and F-150s for adults. We have Nerf guns and paintball guns for 12-year-olds, shotguns and rifles for adults. And in the transitional phase between these child and adult technologies, we recognize the need for parents to play a central role in training and supervision. What would it look like if we treated AI similarly?
As the Declaration puts it, “technology need not be designed in an inherently dangerous way.” It is entirely possible to imagine AI tools specifically designed with the needs and vulnerabilities of children in mind, subject to rigorous safety standards and product liability. American innovators have shown an extraordinary ability to respond to such incentives with ingenious devices that are engaging and educational but unlikely to cost you a limb. At the same time, age-gating access to more general-purpose AI tools would free entrepreneurs to unleash the full potential of these technologies with far less worry, much as driver’s licenses free car manufacturers to build much faster vehicles than they might otherwise dare.
Thankfully, the technology for reliable, secure digital age-verification is now well-established, thanks in part to AI machine learning. It’s time to put it to work to ensure these powerful new technologies achieve their full potential to help humanity, rather than to harm the most vulnerable.
