Artificial knowing | WORLD

Artificial knowing

AI in education is about far more than “not cheating”


metamorworks / iStock via Getty Images Plus


A middle schooler sits at her desk, tasked with writing a short story. She opens her school-issued Chromebook, types a single prompt into an AI chatbot, and within seconds, a decently written, creative story fills her screen. The teacher sees a completed assignment. The student learns nothing.

Yet the greatest danger of AI in education is not that it will help students cheat. It's that it will reshape how they think, learn, and even understand truth itself. My concern is that, introduced too early (or perhaps at all), AI will not supplement learning; it will substitute for it.

Just as we would never hand a teenager the keys to a car without driver’s education, we should approach AI with similar caution—especially for children and teens.

There is understandable excitement about AI in K-12 education. Some parents see it as necessary to equip their children for future jobs. Teachers enjoy using it to generate lesson content and tests. Yet when AI comes up in education debates, the conversation often starts and ends with one concern: cheating. Some schools are shifting toward in-class handwritten assignments and even questioning the "busy work" model that clutters much of K-12 education. While these adjustments address surface-level concerns, they leave the deeper question unanswered: How will AI shape how students think?

The problem is not merely that students may use AI to cheat on assignments—it’s that they may cheat themselves out of the very mental work that forms them into capable thinkers.

A recent Nature Reviews Bioengineering editorial, “Writing is Thinking,” captures this danger well. Writing is not simply a way to display what we know—it is a way of knowing. The act of drafting, wrestling with ideas, revising, and refining forces the brain to process information, clarify thoughts, and form original positions.

A premature use of AI in education risks depriving students of that experience. As the editorial explores, writing helps us uncover “new thoughts and ideas,” think in a linear and structured way, increase brain connectivity, and improve overall learning and memory. These aren’t just academic skills—they are foundational to human intelligence.

Proponents of AI tools in education, like Google's Gemini platform, often claim it will spark independent thought and creativity in children. At first glance, that sounds exciting. With free AI tools, we can write song lyrics and generate music that sounds real. We can design entire websites and graphics without ever opening Adobe. We can produce essays at any level, from middle school to postgraduate, tailored to any worldview or expertise. But are we really producing anything, or simply delegating to AI?

But while promising to foster creativity, AI actually risks hollowing it out. Matthew Crawford, a research fellow at the Institute for Advanced Studies in Culture at the University of Virginia, reminds us that true creativity requires mastery. You must own a subject before you can deviate from it.

A recent MIT study found that students using ChatGPT to write essays retained less information, felt less ownership over their work, and accumulated "cognitive debt" from over-reliance. Researchers even observed lower brain activity. For children's developing minds, this doesn't just cut corners—it dulls the capacity for deep, sustained thought and critical thinking. In the study, students who relied on their own brainpower to write essays had higher neural activity, felt more engaged and curious, and expressed greater satisfaction with their work.

One of the most revealing telltale signs of AI misuse is when scholars or experts publish unsubstantiated facts or cite studies or legal codes that do not exist. These are AI-generated "hallucinations"—false or misleading claims delivered with complete confidence.

An adult might notice these mistakes and fact-check. A child usually won't.

Moreover, the problem isn't just what AI says—it's what it leaves out. Unlike a physical library or a scholarly database, AI is not a neutral index of information. It curates, filters, and sometimes subtly rewrites. I've tested this myself: When I sent a paragraph praising a specific action by President Trump, ChatGPT lightly edited my text to remove his name, attributing the work to "Republicans in Congress."

Such influence extends beyond politics. Large language models like ChatGPT have the power to narrow the range of thought available to a person—not just through errors, but by steering them toward certain interpretations and away from others. In this way, AI can quietly replace truth-seeking with narrative-shaping.

Real education isn't merely about completing assignments, nor about the accumulation of knowledge, but about maturing in wisdom. As the psalmist says, "Teach me to number my days, that I may gain a heart of wisdom" (Psalm 90:12).

While AI may have limited educational uses, such as generating practice exercises, offering examples, or helping brainstorm, it should remain a supplemental tool and not a substitute for the hard but necessary work of cultivating wisdom.

Editor's note: For more about AI in education, see WORLD Magazine's cover story package, "ChatGPA" and "Deepfake learning," and another of today's WORLD Opinions columns, "Digital poverty at St. Dunstan's Academy."


Emma Waters

Emma is a research associate in The Heritage Foundation’s DeVos Center for Life, Religion, and Family.


