As generative AI reshapes higher education, writing professors are taking on a new challenge: teaching students when to engage in intellectual struggle rather than rely on AI tools. Dr. Sarah Thompson, a writing professor at Babson College, has become one of the first educators to publish academic research on the intersection of generative AI and critical thinking. She is now part of an interdisciplinary team examining how AI is transforming education, industry, and society.
From Hope to Grief
For Dr. Thompson, the rise of generative AI initially seemed manageable. In spring 2023, shortly after the release of ChatGPT, she asked her students to use the AI tool to research their favorite musical artist and then fact-check the results as part of a unit in her senior-level social media class. The students’ responses were polished and confident, but often incorrect. Album dates were scrambled, tours were invented, and one student exclaimed, ‘It lies!’ The classroom erupted in discussion, highlighting the risks of overreliance on AI.
By fall 2023, Dr. Thompson found herself grieving the loss of the pre-AI world. In her sophomore-level research writing class, she introduced a new required section titled ‘Be Better Than a Robot.’ This section challenged students to consider why they should not simply let ChatGPT write their research papers. ‘Where would your own work need to come in to create a tiny piece of new knowledge in the world?’ she asked them. The class practiced primary research, allocated time for reading and annotation, and extended deadlines to accommodate the rigor of the assignments.
Cognitive Blind Spots
Discussions around AI use in higher education have grown increasingly complex since 2023. While some argue for a ban on AI in classrooms, Dr. Thompson believes the focus should be on taking the technology seriously rather than outright prohibiting it. Recent data from the Pew Research Center shows that more than half of teenagers are using AI to find information and get help with schoolwork. By the time these students arrive in her classes, many have already developed habits around these tools, habits that may or may not serve their learning.
A study published in the British Journal of Educational Technology in late 2024 found that students using ChatGPT improved their essay scores in the short term but showed no meaningful gains in knowledge. The study also identified a phenomenon called ‘metacognitive laziness,’ where students rely on AI tools to the point where they undermine their ability to self-regulate and engage deeply in learning. This is a result of cognitive offloading, where the brain shifts tasks to external tools, potentially weakening critical thinking skills.
Teaching Discernment
Dr. Thompson feels her role is shifting from that of a neutral observer or co-learner to a guide with a clear perspective. She knows what rigorous thinking looks like in her discipline and can distinguish between a paper that has gone through genuine intellectual struggle and one that has simply been assembled. Her job is to make this difference visible to students who may not yet have the experience to see it themselves.
In her writing courses, Dr. Thompson sometimes asks students to write without AI, not as a purity test, but because understanding what AI does to your thinking requires knowing what your thinking can do without it. This is especially important now, as many college students arrive already anxious, performance-oriented, and optimizing for grades rather than for learning. Many have spent years learning to produce the right answer rather than to wrestle with hard questions.
Dr. Thompson emphasizes the need for students to develop a sense of trust in their own thinking. In practice, this looks like drafting with and without AI, comparing versions, and justifying choices out loud. It also involves noticing when AI accelerates routine work and when it flattens complexity. Learning to sit with uncertainty and tolerate the slowness and mess of thinking things through is where discernment begins.
Like many faculty members navigating this moment, Dr. Thompson finds herself in what Auburn University professors Christopher Basgier and Lydia Wilkes describe as an ‘unsettled middle’: neither fully embracing nor refusing the technology, but engaging with it critically. Her students, she has found, often end up in the same uncertain space. If students are going to continue encountering these tools throughout their lives, ignoring that reality does them no favors. Her responsibility is to help them develop the judgment to decide when a shortcut is strategic and when it undermines their own thinking.
Dr. Thompson believes that the role of a writing professor in the age of AI is not to ban the technology but to guide students in understanding when to struggle and when to use AI as a tool. This approach is not about rejecting progress but about ensuring that students develop the critical thinking skills necessary for a world where AI is increasingly prevalent.