Reimagining Critical Thinking in the Age of GenAI

Post by Lynette Pretorius, Redi Pudyanti, Sun Yee Yip, and Mihye Won.

As educators, we often find ourselves at a crossroads when a new technology disrupts the familiar rhythms of the classroom. The rapid rise of generative AI has certainly sparked its share of apprehension. Will it make our students intellectually lazy? Will it replace the deep, messy work of thinking with a polished, “ready-made” answer? In our recent scoping review, we explored these very questions by examining how generative AI is being integrated into initial teacher education to foster critical thinking. What we discovered was both encouraging and a call to action: generative AI does not have to be a shortcut; it can be a powerful scaffold for developing the analytical and reflective skills our future teachers need.

One of the most significant findings from our review is that the way we position generative AI matters far more than the technology itself. We found that generative AI is being integrated into teacher education in three primary ways: as a resource generator, as an assessment tool, and, most impactfully, as a collaborative partner. The greatest gains in critical thinking occurred when generative AI was treated as an interlocutor: a “thinking buddy” that makes reasoning visible and negotiable. When pre-service teachers engaged in dialogic work, such as asking the generative AI to provide counter-arguments, comparing different proofs, or iteratively refining a lesson plan, they were not just consuming information. They were exercising analysis, evaluation, and self-regulation.

Critical thinking is not only about what we can do; it is about who we are as thinkers. Our review highlighted that successful generative AI integration can nurture essential intellectual virtues or dispositions, including:

  • Intellectual humility: Being willing to revise one’s own claims in light of new evidence or a well-reasoned AI-generated suggestion.
  • Open-mindedness: Actively seeking and weighing competing interpretations.
  • Fair-mindedness: Evaluating information with a commitment to evidence rather than mere opinion.

By treating generative AI’s suggestions as starting points for inquiry rather than as final products, students learn to engage in the disciplined inquiry that defines professional teaching judgment.

Of course, the path is not without its obstacles. We identified several challenges, including a lack of AI literacy among both students and teachers and the ever-present ethical dilemmas regarding authorship and bias. However, we believe these “weaknesses” are actually hidden pedagogical opportunities. When a generative AI fabricates references or produces a biased response, it creates a teachable moment. When incorporated into pedagogical discussions, these dilemmas force students to interrogate sources, cross-check facts, and defend their own decisions. In this way, the very limitations of the technology can serve as a catalyst for critical thinking.

As we look to the future, it is clear that we must move beyond simply using generative AI and instead embed it with purpose. The goal is not to produce generative AI experts but to cultivate critical, reflective, and ethically attuned thinkers. This can include a variety of strategies:

  1. Embed AI literacy explicitly in curricula. Lynette, for example, teaches the AI literacy masterclass for graduate research students in her Faculty. She also embeds AI literacy into her referencing and academic integrity workshops for all undergraduate students, including teaching them how to write an AI acknowledgement and assessing these acknowledgements in her rubrics. For those interested in a template AI acknowledgement that helps to foster students’ understanding of ethical generative AI use, this is the acknowledgement she currently uses:
    • I acknowledge that I used [NAME OF GENERATIVE AI] ([COMPANY], [LINK]) during the preparation of this assignment. This generative AI acted as a critical friend and copy editor, helping me brainstorm and critique some of my ideas, refine my phrasing, and reduce my word count. The content of the conceptual illustration in Figure 1 was developed by the author and visually rendered using the [GENERATIVE AI IMAGE MODEL]. In line with requirements, the Terms of Service have been consulted, and [COMPANY] does not claim ownership of any image output generated in collaboration with [NAME OF GENERATIVE AI]. Instead, the user retains all rights to use, edit, and distribute the images, including for professional or commercial purposes. Any generative AI suggestions incorporated into this assignment were adapted to reflect my own style, voice, and ideas. I take full responsibility for the final content of the assignment, noting that it represents my original ideas and adheres to academic integrity and quality requirements.
  2. Prioritise equity. The embedding of explicit AI literacy is one element of prioritising equity: we need to ensure that students have equal opportunities to engage with generative AI if they want to do so. Additionally, we need to embed mechanisms for engaging with the ethical aspects of generative AI. For example, one of the tasks Lynette embeds in her teaching is encouraging students to create avatars for their profile on the learning management system using generative AI. She suggests they use the following prompt: “Act as an artist and create an avatar of me that I can use on my University’s learning management system. Ask me all the questions you think would help you draw an accurate picture of me and my interests. Once you have all the answers you need, draw the image. I will then give you any suggested tweaks I would like you to make.” This task is a great example of designing activities that embed dialogue (see our next point), but it is also a very effective way to surface hidden biases in generative AI models (e.g., drawing a particular gender when asked to draw a mathematics teacher, or assuming a specific cultural background based on a person’s name). These moments can then be used intentionally as prompts for in-class discussions about the importance of always evaluating generative AI suggestions. Furthermore, we must honour student agency by providing meaningful opt-out pathways for those with ethical or philosophical objections to generative AI, ensuring that rigorous, non-AI alternative assessments are always available.
  3. Purposefully design tasks to embed dialogue. A key aspect of AI literacy is seeing the generative AI as a collaborative thinking partner rather than a mere tool. Lynette, for example, incorporates generative AI in her autoethnography unit to help students express greater creativity and deepen their reflections on their own experiences. This works because it encourages purposeful play in the classroom, as she describes in the video below. Importantly, tasks like these should include opportunities for students to reflect on their reasoning and decision-making to foster the critical thinking skills we seek to cultivate.

As you think about the implications of our research on your own practice, consider how you might reframe a current assessment or classroom activity to position generative AI as a collaborative thinking tool.
