Train the Brain, Not the Bot.


AI isn’t the enemy of critical thinking. Poor education is.

A few days ago, I was reading Jessica Grose’s column in The New York Times, titled “A.I. Will Destroy Critical Thinking in K–12.” What a title! I highly recommend reading it. But what really stuck with me was the story she told: a car full of middle schoolers gossiping about teachers, and one of them scoffing, “I bet she uses A.I. to grade our papers.” Whether it was true or not, it said everything. Even seventh graders already sense what’s at stake: that AI might mean less personal care, less trust, more shortcuts.

But let’s pause. Is critical thinking really in danger? AI isn’t the enemy of thinking. Poor education is. If we combine smart tools with human wisdom, we can create classrooms where critical thinking doesn’t just survive – it thrives.

The real danger? How we use AI.

The real danger isn’t AI itself – it’s how we use it. That’s why we need two things in every classroom: guidance on how to use AI well, and the wisdom to keep human connection and learning skills at the heart of education. I founded enduri, a learning platform, with these two beliefs in mind. First: digital tools should support human interaction, not replace it. Second: learning skills such as critical thinking aren’t something kids just “have” – they’re skills they can train, like a muscle. That takes strategy, coaching, reflection, and yes, productive struggle. No pain, no brain gain.

This article explores:

  • What kind of guidance schools need around AI.
  • What can – and can’t – be outsourced to AI.
  • Why human connection still matters (more than ever).
  • How AI challenges thinking – and how platforms like enduri turn that into an opportunity.
  • Why “learning how to learn” and tools like enduri are more essential than ever.


AI can’t replace a teacher. It doesn’t know when a student had a rough night, or when a quiet child needs encouragement. It can’t empathize, or build trust.


AI in School: Guidance Desperately Needed

What counts as cheating?

Let’s be honest: AI entered classrooms before anyone really knew what to do with it. In 2024, two-thirds of U.S. teachers were already using generative AI tools like ChatGPT – while fewer than 10% of schools had clear rules in place [2]. Globally, a UNESCO study found that most institutions had no official AI policies at all [3].

This “Wild West” scenario leaves everyone – teachers, parents, students – uncertain. Can students use ChatGPT for homework? How do we spot AI-generated essays? What counts as cheating? More importantly: how do we protect student data, privacy, and trust?

Clarity instead of a ban

Parents are asking the same questions. In a World Economic Forum survey, 81% of parents and 72% of students said they’d welcome school-wide guidance on using AI responsibly [3]. They don’t want a ban. They want clarity. Some governments and NGOs are responding. UNESCO published global guidance in 2023 calling for teacher training and transparency [4]. South Korea, Japan, and the EU are crafting national strategies – with the EU classifying education-related AI systems as “high risk,” requiring strong oversight and accountability [5].


Still, most schools lag behind. Without clear, age-appropriate frameworks, we risk creating systems that confuse students, overload teachers, and increase inequality.


What Should AI Do—And What Should It Never Do?

AI can’t replace teacher judgment.

So how do we decide what AI should handle in school? The answer: AI is great at repetition. It’s terrible at judgment.

Let AI handle routine tasks – grammar checks, math drills, quiz generation. Studies show this can reduce teacher workload and provide students with instant feedback [6].

But don’t expect AI to evaluate creativity, assess emotional nuance, or understand what makes a student tick. Even well-trained systems often “hallucinate” – producing confident but false answers [1]. Worse, AI can reflect bias from its training data, potentially disadvantaging marginalized students [7].

Let’s be clear: AI can’t replace teacher judgment. It doesn’t know when a student had a rough night, or when a quiet child needs encouragement. It can’t inspire, empathize, or build trust.


Why We Still Need Humans at the Center

Education is built on relationships

Grose said it best: when students feel their work is being processed by machines, not people, it damages trust in the learning process [1].

Education is built on relationships – between teachers and students, students and peers, learners and mentors. Studies show that strong student–teacher connections improve motivation, academic outcomes, and social development [8]. Especially in elementary and middle school, kids need adults who see them.

A 2024 global survey confirmed it: 73% of students and 72% of teachers said AI will never replace the human role in classrooms [9]. That’s not nostalgia – it’s common sense.


Can AI Help—or Harm—Critical Thinking?

The fine line between harm and help

Now to Grose’s core concern: does AI weaken students’ ability to think for themselves?

If used passively – absolutely. In a 2025 study, students who used AI for test prep performed better on practice problems – but worse on the final test, which didn’t allow AI. Why? They hadn’t actually learned – they’d leaned on the tool [10].

This is the real risk: students becoming dependent on tools that give answers, without learning to ask better questions. As Grose put it, AI encourages “surface perfectionism without developing the tools and stamina necessary for true critical thinking” [1].

But here’s the hopeful part: used actively, AI can sharpen thinking. Students can critique AI answers, compare alternatives, test arguments, and reflect on their reasoning. But to do this, students need guidance, learning skills, and strategies. That’s exactly the approach we’ve built into enduri.


AI as a learning sparring partner.


How enduri Deals with AI

Instead of content: learning strategies

enduri is a learning platform focused on ages 9–12, a group I call the “golden window” for learning how to learn. At this age, kids are ready for self-reflection and planning, but still curious and open-minded – and not yet overwhelmed by the screen-and-AI tsunami. It’s the perfect time to build habits like questioning, goal-setting, and critical thinking [11].

Instead of AI: skills to deal with AI

enduri is not an AI tool. It’s the tool that gives students the skills to deal with AI – to be in charge of AI, not the other way around. Rather than serving up pre-made content (AI-generated or not), enduri teaches how to learn: how to summarize, how to plan, how to reflect. It’s all powered by human brainwork, supported by a digital structure. Cooperation, communication, creativity, and critical thinking are at its core. After each unit, students reflect: What worked? What didn’t? What could I try next time?

Teachers get access to Learner IDs, progress dashboards, and coaching templates. Students receive personalized strategy suggestions – no AI-generated shortcuts, just metacognition, feedback, and growth. The name enduri comes from “endurance” – it’s learn-to-learn work, packaged in an inclusive, gamified, colorful, and interactive way.

And yes, there’s enduRO – a friendly AI sidekick that reads text aloud, answers simple questions, and offers motivational nudges. He’s there to save time and support kids. But the thinking? That’s still all human. Think of it like a trampoline: enduRO gives you a bounce, but you still have to jump. Learning can be fun – but it’s still work.


Why We Still Need Learning Platforms

AI gives us content, it doesn’t give us learning

In an AI-saturated world, do we really need platforms like enduri? Yes. More than ever.

Because while AI gives us content, it doesn’t give us learning. Learning takes effort, coaching, strategy, and support. That’s what enduri teaches – and makes visible, trainable, and lasting.

We don’t need more answers. We need better learning habits that empower students to ask smart questions. And if we teach those early – especially between ages 9 and 12 – we build mindsets that last.

AI is a huge challenge for teachers, parents, and students. But every challenge comes with risk and opportunity. Let’s meet it with clarity, not fear. 


Sources

  1. Jessica Grose, “A.I. Will Destroy Critical Thinking in K–12,” The New York Times, May 14, 2025.
  2. Micah Ward, “Teachers love AI, but they need more guidance from schools,” District Administration, 2023.
  3. Hadi Partovi & Pat Yongpradit, “AI and education: Kids need AI guidance in school. But who guides the schools?” World Economic Forum, Jan 18, 2024.
  4. UNESCO, “Guidance for generative AI in education and research,” Sept 2023.
  5. European Commission, “Artificial Intelligence Act – Proposal,” 2023.
  6. Shallon Silvestrone & Jillian Rubman, “AI-Assisted Grading: A Magic Wand or a Pandora’s Box?” MIT Sloan EdTech, May 9, 2024.
  7. Jill Barshay, “University students offload critical thinking to AI,” Hechinger Report, May 19, 2025.
  8. “6 Things Teachers Do That AI Just Can’t,” Education Week, Aug 31, 2023.
  9. Copyleaks, “Bridging the Gap: AI Adoption in Education,” 2024.
  10. Karatas, C. et al., “Generative AI and Student Learning Outcomes,” University of Pennsylvania, 2025.
  11. Christa Wüthrich, “Learning Strategies: Why Timing Is Everything,” enduri Blog, 2025.
