AI cheating in the classroom shows the need for a revolution in teaching
I've been involved with the teaching profession for decades, as a college professor, high school lecturer, professional trainer, writer, and parent. I've also tried to live the philosophy of "always be learning" in my own life. Honestly, why would anyone stop seeking new knowledge in our ever-changing world?
Knowledge can make you uncomfortable, because it can force you to rethink your beliefs and values or to revise your worldview as you learn something new. But that's the foundation of an evolving, growing society. This is also why I study world history: the more I learn about the journey of civilization, the more I understand the why of contemporary world events.
More to the point, I believe that everyone is curious about some aspects of their life, whether it's studying the lineup for a favorite sports team, trying to decipher a newly proposed bill in Congress, or expanding one's knowledge of a group or place.
The cheating thing
If it's true that students truly possess curious minds, why do they cheat? I teach classes at the University of Denver, and like all institutions of higher education, we have an Honor Code that explicitly defines and prohibits plagiarism and other forms of cheating. It was recently updated to spell out which uses of generative AI count as cheating.
The vast majority of students in my classes either wouldn't dream of cheating, are afraid they'll get caught, or are so subtle in their approach that it's undetectable.
However, one or two students every quarter are lazy and submit content produced by AI rather than their own work. These submissions stick out like the proverbial sore thumb; every other student is writing about how "I think that..." while the cheater has "The word XX is defined as..." or something similar.
The universal defense is that they're using writing tools to improve their prose, but I remain skeptical. After all, when I invite them to resubmit the assignment sans AI, they have completely different ideas and entirely acceptable prose.
If it's worth doing...
Underlying my contemplation of this situation is the adage If it's worth doing, it's worth doing well. The problem here isn't the existence of the tools, or even the questionable companies advertising "undetectable AI for students," but the commitment of the students to the learning process itself.
In K-12, students are more or less stuck getting through all of their classes to earn a high school diploma and then march dutifully forward into the workforce. But no one is forced to go to college, and there's plenty of data showing that a college degree now delivers less value over the span of a career than at any point in the last 50 years.
Still, there are plenty of parents and young adults who are convinced that a degree will open up opportunities that would otherwise be unavailable. That's probably still true at this point, but will that be the case in a decade? In 25 years?
More importantly, what's the point of attending college? To answer my own rhetorical question, it's to learn, to gain knowledge, skills, and experience, both in a field of study and in interpersonal interactions. With precious few exceptions, we're all going to work, live, and play with others, so learning about both your subject and how to work well with others is critical.
What's the solution?
Students who are using tools such as ChatGPT, Claude, and even Grammarly (which has a heavy AI component now) to replace learning are shortchanging themselves and adversely impacting their classmates. AI-generated, top-of-the-class assignments skew the curve.
The key word in that sentence, though, is replace. What I suggest instead is that it's the responsibility of the teaching profession to teach students how to use AI tools to enhance their learning process.
Bored with your Foundations of Modern Philosophy class and don't really understand the argument against epistemological relativism? What if you could instead argue the pros and cons with an AI-powered avatar, learning about basic philosophical concepts as they apply to you and your life and interests?
Try it: Ask Perplexity or ChatGPT to "apply epistemological relativism to journalism and how it relates to whether a given article should be published as content or an op-ed." Then ask it to apply the philosophy to Game of Thrones. Now extrapolate that to every student and every subject in every class. Why would anyone cheat when they can apply new ideas and concepts to their favorite topics?
Foundational to this question is my belief that the vast majority of students are open to learning new and interesting things. The challenge of academia is to frame the topics teachers want to share with themes and concepts that resonate with these students. That's something we've been wrestling with forever, and finally, we have some tools that can really power change.
The new question isn't how to detect or ban AI, but how to incorporate it into education to make everything more fun, engaging, and lively. To use it to build a culture of learning and a more educated populace. Just imagine...