Should elementary students use AI tools?
Experts and parents weigh possible benefits and safety concerns of artificial intelligence

Dallas-based Nate Fischer says that, while most of his son’s homeschool classes are low-tech, the 10-year-old began using a personalized AI tutor from the app Synthesis to supplement his math lessons last year.
“[My son] said something like, ‘I’m learning the same things I learn in my regular math class, but it’s just more fun when I do it in Synthesis,’” said Fischer. “They are able to make it fun and they are able to make it tailored to the person.”
Fischer said he heard about the app through his work. He leads a venture firm that is considering investing in Synthesis, whose co-founder Chrisman Frank says he is a Christian. Fischer felt he could trust the app because he knows the values of its creator.
For now, Synthesis just offers his son extra practice, Fischer said. But he could see AI technology becoming a bigger part of education for all five of his children, especially for subjects that have objective answers like math, he said.
According to a recent Common Sense Media survey of parents, nearly 1 in 3 children ages 8 and under have used AI for learning — whether about school-related material, about AI itself, or to build critical thinking skills. While experts are divided over the benefits and pitfalls of AI learning for children, they agree that the new educational tool isn’t going away and that parents need to be involved with their children as they use it.
In the Common Sense Media study, around a quarter of respondents said AI tools have positively affected their child’s understanding of school materials, while a majority said the tools have had no effect.
While well-known AI programs like ChatGPT or Google’s Gemini require users to be over 13 years old, Synthesis and similar apps focus on younger students. Ying Xu, an assistant professor at the Harvard Graduate School of Education who studies AI’s effects on children and families, said her 5-year-old daughter uses Khan Academy’s Khanmigo.
AI offers a more interactive approach than traditional media such as television because children can engage with the technology, Xu said, adding that this enhances the time children already spend using digital technology. According to the American Academy of Child and Adolescent Psychiatry, children between the ages of 8-12 spend an average of 4-6 hours a day on screens.
“Parents should actually embrace and be more open-minded and [have] transparent conversations with their kids instead of shutting down and telling them, ‘No, using AI is bad or dangerous,’” Xu said. “[They need a] willingness to navigate this space alongside their kids.”
Though AI tools could pose hazards, Xu said that “better-designed AI” can mitigate them. For instance, she said some evidence shows that students who used a general-purpose chatbot like ChatGPT learned less than those who used no AI tools. But students who used dedicated AI tutors did not experience these “detrimental effects on learning,” she added.
A 2024 Barna study found that one-third of parents are strongly concerned about children’s data privacy regarding AI use, but only 17% of parents said they are actively seeking out information to better understand AI tech. Xu said that parents need to check an AI tool’s website to see what information is available and what user data it keeps.
Organizations like UNICEF are questioning how frequent use of chatbots for education can affect children’s cognitive, emotional and behavioral development. Xu acknowledged children can struggle to differentiate between human interactions and AI interactions, which could affect their speech or social development. She recommended that parents avoid using tools that have personified AI as a friend for children, because it blurs the line between humans and machines.
Some evidence shows what can go very wrong with that. Last fall, a Florida mother filed a lawsuit that claimed her son took his own life because of his reliance on a Game of Thrones-inspired AI chatbot. She alleged that her son had “abusive and sexual interactions” with the chatbot and that it encouraged him to commit suicide.
AI chatbots can seem very human to children and this perspective can be “quite detrimental,” said Jason Thacker, the director of the research institute at the Ethics and Religious Liberty Commission. It’s concerning how easily AI chatbots can substitute for companionship for children, he said.
Thacker added that Christian parents need to be aware that the technology isn’t neutral. Both the companies behind AI tools and the information their chatbots expose children to reflect a specific worldview.
Additionally, AI apps don’t always align with the purpose of education, which Thacker defined as promoting critical thinking, communication, and community. AI tools are good at transferring information but not at growing the whole person, he said.
“We can’t reduce education down simply to information transfer,” Thacker said. “Sitting in front of a screen all day, even with a personalized learning assistant—how are we getting active, involving our bodies, thinking, cultivating curiosity and wonder about the world God has created?”
Thacker pointed out that AI learning can be used well when it promotes pedagogical thinking. He tells his students to dig into a question on their own through discussion and research. Then he poses the same question to a chatbot, and his students analyze its answer.
AI resources are only going to keep getting better and more prevalent, he said, echoing recommendations that parents stay involved.
Bret Eckelberry, managing editor at Plugged In, said parents need to ask teachers if they use AI in class, and if so, for what activities or material. AI technology can benefit children by personalizing their learning and allowing them to merge subjects they are currently learning with their other interests, he said, echoing the experience of Nate Fischer and his son. For instance, AI tools could create a dinosaur-themed math worksheet for a child who loves the prehistoric creatures.
Parents should explore usage of the tools alongside their children to help develop open communication regarding it, Eckelberry said, so that if problems arise, children can come to their parents with questions.
“At Plugged In, we see AI as a tool, and like any tool it has the potential to be used well and appropriately. On the other end of the spectrum it can be used very, very poorly and it can be harmful,” Eckelberry said. “We want to encourage parents to just walk the [AI] journey with their children.”
