AI and Homeschooling: How to Protect Critical Thinking in the Age of Chatbots
AI and the Problem of “Cognitive Offloading”
Ever since the emergence of generative AI in 2022, there have been concerns about how its use affects analytical thinking skills. The fear is that increased use of advanced AI models will erode users’ ability to think critically, that is, to engage their own intellect in observation, analysis, and problem-solving. The question is whether the mainstreaming of AI will produce a generation of weak thinkers: people who rely on bots to tell them the answer rather than engaging in the intellectual work of discovering it.
This would be a gravely alarming state of affairs, as an unthinking populace is vulnerable to manipulation, misinformation, and life’s uncertainties. A person who cannot think critically is forever dependent on others (be they people or bots) to direct them, as they lack the agency to understand their situation and chart a course for themselves.
But what does the evidence say? In this article, we examine research published since 2022 on how AI affects critical thinking skills.
Automating Mundane Tasks
Three recent studies (Lee, 2025; Jackson, 2025; Gerlich, 2025) examined the relationship between AI use and critical thinking skills in the workplace and classroom. The studies suggested that AI usage is a double-edged sword.
On the one hand, AI proved a powerful tool capable of enhancing analysis and problem solving when it was used to automate mundane tasks, such as data retrieval and entry, formatting, or sorting. AI not only freed up users’ time considerably but also enhanced their problem-solving ability by reducing the hours spent on routine work, allowing them to focus more exclusively on their core tasks. In other words, AI proved to be a net benefit to the user’s critical thinking when the tasks it was asked to automate did not require higher-level thinking skills.
We might compare this to a writer using CTRL+F to find a word in a long document. Using CTRL+F to find a word does not impair critical thinking skills, because the alternative—scanning pages of text manually looking for your keyword—is not a higher-level thinking skill to begin with. CTRL+F saves time by automating what would otherwise be a tedious chore and frees you up to be more productive with your writing.
The Problem of Cognitive Offloading
The situation changes dramatically when users begin to rely on AI models to perform higher-level thinking skills. Asking AI to assist with mundane tasks is one thing, but researchers have noted that people who rely too much on AI to perform higher-order thinking skills may experience excessive cognitive offloading.
Cognitive offloading is the externalization of mental labor onto external tools, essentially “outsourcing” thinking to something outside the head (Nosta, 2025). Cognitive offloading happens all the time. For example, when you write down a shopping list, you are cognitively offloading. You are externalizing your shopping list onto paper so you don’t need to expend the mental effort to memorize it. The difference is that AI enables users to cognitively offload complex mental tasks, thereby bypassing the so-called “deep thinking” required by traditional problem-solving.
The research suggests that frequent (e.g., daily) reliance on AI for complex problem solving is strongly correlated with a decline in critical thinking skills. People who rely on AI for problem-solving exhibit a decline in their ability to reason independently. When users cognitively offload “deep thinking” tasks to AI, the parts of the brain responsible for critical analysis essentially say, “Oh good, I’m not needed, I can shut down,” just as your brain says, “Oh, good, I don’t need to memorize my shopping list” after you write it down. When users become reliant on AI over a prolonged period (the studies’ observation periods ranged from a few weeks to several months), the problem becomes more pronounced.
Researchers noted that the more trust users had in their AI algorithms, the less likely they were to independently verify AI’s output. Essentially, they became content to cognitively offload their critical reasoning entirely to the AI, thereby becoming passive in the act of cognition.
Of particular concern was that this trend was more pronounced among younger populations, who are less likely to have had their formative educational experiences before the rise of AI. All the studies noted that younger people seemed more affected than their older peers.
Bolstering Critical Thinking Skills
As parents and educators, how can we ensure that our children maintain robust critical thinking skills? Given the inevitability of their interaction with AI (and no matter what you may think, it is inevitable in the long run), how can we give them healthy thinking skills so they don’t offload all of their mental effort to algorithms?
Here are four suggestions based on the research:
Encourage Active Exploration of Data
Instead of letting AI give final answers, have your children interpret AI-generated data themselves. Ask them to test hypotheses, identify patterns, and gather additional evidence. When used thoughtfully, AI becomes a tool for genuine scientific inquiry—such as having students find data to support or challenge conclusions about real-world topics. This encourages children to actually grapple with data and think about what they are observing instead of just letting the AI tell them what they’re looking at.
Use AI to Build Stronger Arguments
Have children use AI to collect evidence for debates or arguments, but require them to fact-check everything it provides. (I can personally attest that it frequently makes things up out of thin air.) You could also give AI a scenario (e.g., a hurricane’s path), get two explanations—one correct, one plausible but wrong—and ask your child to do research in order to decide which is accurate and explain why.
Require Claim-Evidence-Reasoning
Claim-Evidence-Reasoning (CER) is an observation-based learning approach in which students act like scientists by asking questions, forming hypotheses, and testing ideas. AI can help by creating simulations or offering explanations. However, students must always critically evaluate AI outputs and clearly explain their reasoning using the CER framework. It is basically the scientific method as applied to student work. (Note that I plan to discuss the CER method more deeply in a future article.)
Always Treat AI as a Resource, Not a Quick Answer
Finally, always present AI as a starting point for exploration, not a way to skip thinking. Encourage students to treat it like a discussion partner—use its data to spark class debates, group projects, or deeper questions. Prompt them to ask things like: “Where did this data come from?” “Is it biased?” or “What else do we need to know?” This keeps students actively involved in solving problems. This is not dissimilar from what parents and educators had to do a generation ago with the rise of Google: children had to be taught to use Google as a research tool, not as a substitute for their own work.
Conclusion
The continued spread of AI is changing society and education every day, and we cannot yet foresee how it will all pan out. We can, however, see a few emerging trends, and research tells us that if we want our children to grow up with solid critical thinking skills, we must go out of our way to ensure they do not cognitively offload their God-given reason to AI bots.
What are your thoughts on this topic? Join other homeschooling parents and me in the Homeschool Connections Facebook Group or in the HSC Community to continue the conversation.
Works Cited
Gerlich, M. (2025). AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking. Societies, 15(1), 6.
Jackson, J. (2025, January 13). Increased AI use linked to eroding critical thinking skills. Phys.org. https://phys.org/news/2025-01-ai-linked-eroding-critical-skills.html
Lee, H.-P., Sarkar, A., Tankelevitch, L., Drosos, I., Rintel, S., Banks, R., & Wilson, N. (2025). The impact of generative AI on critical thinking: Self-reported reductions in cognitive effort and confidence effects from a survey of knowledge workers. In CHI Conference on Human Factors in Computing Systems (CHI ’25), April 26–May 1, 2025, Yokohama, Japan. ACM. https://www.microsoft.com/en-us/research/wp-content/uploads/2025/01/lee_2025_ai_critical_thinking_survey.pdf
Nosta, J. (2025, January 19). Cognitive offloading with AI boosts performance but may hinder deeper learning. Psychology Today. https://www.psychologytoday.com/intl/blog/the-digital-self/202501/the-shadow-of-cognitive-laziness-in-the-brilliance-of-llms
