AI in Classrooms—A Double-Edged Sword?
Rising AI use in schools may weaken critical thinking, student-teacher bonds, and peer interactions, while proper AI training remains inadequate
The rapid rise of artificial intelligence tools in K-12 classrooms is raising concern among educators, parents, and researchers who say students are losing more than they gain.
At the heart of the issue are fundamental questions about the true goals of education: are we crafting future adults with critical thinking and reasoning skills to fully participate in the discourse of the nation? Or, are we merely churning out little prompt-bots, content to be part of a garbage in-garbage out (GIGO) loop producing endless reams of AI slop?
A report from the Center for Democracy and Technology found that 85 percent of teachers and 86 percent of students used AI during the 2024-25 school year. However, many of the effects on student learning are proving troubling, including weaker relationships, less critical thinking, and blurred lines of academic honesty.
Half of surveyed students said using AI in class made them feel less connected to their teachers. Nearly half of parents and teachers also noted declines in peer-to-peer interaction tied to AI reliance.
Seventy percent of teachers said they worry AI weakens students’ ability to think deeply and conduct real research—skills long seen as core to success in school and beyond.
Perhaps even more worrying, “less than half of teachers (48%) have participated in any training or professional development on AI provided by their schools or districts; and less than half of students (48%) said someone at their school provided information to students on how to use AI for schoolwork or personal use.”
“As many hype up the possibilities for AI to transform education, we cannot let the negative impact on students get lost in the shuffle,” said Elizabeth Laird, Director of the Equity in Civic Technology Project at the Center for Democracy and Technology.
“Our research shows AI use in schools comes with real risks, like large-scale data breaches, tech-fueled sexual harassment and bullying, and treating students unfairly,” Laird said. “Acknowledging those risks enables education leaders, policymakers, and communities to mount prevention and response efforts so that the positive uses of AI are not overshadowed by harm to students.”
The report prompted the Center to send a letter to the U.S. Department of Education, urging the Department “to incorporate the Principles for Responsible Use in carrying out the directions in Executive Order 14277, Advancing Artificial Intelligence Education for American Youth.”
While students and teachers alike are using AI for tutoring, career advice, and even personal matters, critics say this can sideline the effort and reflection that make learning meaningful.
Joseph South, Chief Innovation Officer for ISTE + ASCD, a nonprofit supporting teachers, says schools must act to “help teachers and students use [AI tools] in the right and best ways.” At present, less than half of teachers and students report receiving any formal guidance about AI from their schools.
Some educators take a stronger stance. English teacher and writer Meg Marie Johnson argues in The Federalist that introducing AI into classrooms risks “destroying actual intelligence” by encouraging students to rely on AI for the processes of thinking and writing. Johnson contends that “outsourcing parts of the process to algorithms and machines is outsourcing the rewards of doing one’s own thinking,” because “these are not steps to be skipped over with a ‘tool,’ but rather things people benefit from learning if they value reason. Strong writing is strong thinking.”
To Johnson, the purpose of education goes beyond merely being able to produce an acceptable end product using a computer:
“If the goal is simply to produce outcomes, one could argue that AI usage should not just be tolerated but encouraged. But education shouldn’t be about producing outcomes—whether it be a sparkling essay or a gripping short story—but shaping souls. The purpose of writing isn’t to instruct a prompt or even to produce a quality paper. The purpose is to become a strong thinker and someone who enriches the lives of everyone, no matter their profession.”
Citing a recent MIT study showing that AI usage decreases cognitive functions such as critical thinking, Johnson asks an uncomfortable question:
“Seems rather odd to insist that something proven to weaken our brains should be introduced to institutions of learning, isn’t it?”
Johnson and other critics say the emphasis should remain on building reasoning, perseverance, and human interaction—qualities they fear are fading as students turn to AI to do work for them.
As the debate over educational AI continues, schools, parents, and policymakers must decide how to balance the technological promise of AI against the foundational mission of the education system: producing adults capable of keeping the Republic.
The report from the Center for Democracy and Technology, Hand in Hand: Schools’ Embrace of AI Connected to Increased Risks to Students, is available HERE.
Editor’s note: the MIT study referenced above by Kosmyna et al., Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task, used electroencephalography (EEG) to record participants’ brain activity and assess cognitive engagement and cognitive load during various essay-writing tasks. The study found that “The use of LLM had a measurable impact on participants, and while the benefits were initially apparent, as we demonstrated over the course of 4 months, the LLM group's participants performed worse than their counterparts in the Brain-only group at all levels: neural, linguistic, scoring.”
The complete study is available as above, as well as at THIS LINK.