AI and Critical Thinking Skills: Teaching Students to Question the Machine
AI in classrooms is here to stay. Help students build critical thinking skills with fact-checking, analysis, and bias evaluation strategies.
Let’s be honest: it’s getting harder to tell what was written by a student and what was written by a machine. Whether you’ve caught a suspiciously polished essay or had students proudly submit AI-generated content, it’s clear that the presence of tools like ChatGPT is changing the academic landscape. But here’s the bigger question: Are our students equipped to think critically about the information AI gives them?
In the rush to discuss academic integrity and plagiarism policies, we may be missing a crucial opportunity. AI isn’t going away, so instead of simply banning its use, we should be helping students learn to evaluate it—just like we would teach them to assess any other source of information.
A Teachable Moment (Disguised as a Prompt)
This semester, I assigned a short writing task in my course. About a quarter of my students used AI to help with their drafts (some told me, others… not so much). But what struck me wasn’t that they used the tool—it was that many of them didn’t check whether the information was accurate, current, or even coherent. They assumed, “If AI said it, it must be right.”
That’s the mindset we need to challenge. AI is not an authority. It’s a pattern-based prediction engine, prone to "hallucinations" and biases. And that means it creates a perfect scenario to teach critical thinking: a source that sounds smart but might be wrong.
Build the Evaluation Muscle
Here are a few strategies I’ve started using to encourage students to approach AI-generated content with a more critical eye:
- Fact-Checking as an Assignment
Ask students to generate an AI response (to a prompt in your field), then spend the bulk of the assignment fact-checking it. Where does the information come from? Are there inaccuracies or omissions? What would a stronger response include? This works well for research, history, and even STEM explanations.
- Compare and Contrast
Have students compare an AI-generated summary of a reading or concept with the actual source. Where are the gaps? What nuance is lost? This is especially effective in fields where context, tone, or methodology matter—like literature, philosophy, or social sciences.
- Reverse-Engineering the Prompt
Show students a flawed AI answer and ask them to infer what the prompt was. Then have them revise both the prompt and the AI’s response. This helps them see how input affects output—and how vague prompts often lead to vague (or wrong) answers.
- Bias and Perspective Discussions
AI tools reflect the data they were trained on. That means cultural, historical, and ideological biases are often baked into their responses. Give students AI outputs related to topics like gender, race, politics, or global events and analyze the tone, framing, and assumptions. This is a great opportunity for interdisciplinary collaboration, too.
From Consumers to Editors
A mindset shift is needed: students shouldn’t be passive consumers of AI content. They should become editors—people who question, revise, challenge, and improve. We already teach this skill when students work with peer feedback or revise their own writing. The difference now is that the “peer” might be a chatbot.
The bonus? When students learn to critique AI, they often become more thoughtful about their own arguments. They ask better questions, consider multiple perspectives, and stop taking the first answer as the final one.
Let’s Keep the Conversation Going
Like you, I’m still experimenting. I don’t have all the answers, and AI is evolving faster than most of us can track. But what I do know is that this is a golden opportunity to reinforce the very skills higher ed has always valued: analysis, reasoning, curiosity, and skepticism.
Have you found creative ways to help students engage critically with AI? Let’s share and learn from each other—just like the machines are doing. Only better.