by Rob Lentz

SUNN Post Exclusive

Generative AI tools such as ChatGPT have transitioned from novelty to norm in the classroom this year. But students are not the only ones experimenting: educators are testing the benefits of AI for themselves. The New York Times reported earlier this month that former Northeastern University student Ella Stapleton demanded a tuition refund for a class in which the professor appeared to use AI to write his syllabus while forbidding students from using AI on their assignments.

This pedagogical dilemma has no clear playbook. Some professors are integrating AI into coursework, treating it as a tool students must learn to navigate. Others are tightening academic integrity policies, banning AI outright with repercussions similar to those for plagiarism. As higher education adapts, it can look back at earlier innovations: the calculator, which professors protested mightily before accepting, and Wikipedia. Both are now commonplace in classrooms from early education through graduate school.

Dr. Tracy Mitrano, a visiting professor at Cornell University, uses ChatGPT to work more efficiently when creating supplementary content for students. For her class "Law, Policy, and Politics of AI," taught in the spring of 2025, she used ChatGPT to create a high-level outline of international law to support a lecture. Prompting the AI and reviewing its output took her less than an hour, she said, instead of the many hours it might otherwise have taken to prepare for just 30 minutes of class time. Still, she finds it important not to let AI determine her perspective on the core content of the class, which she calls the "fun of teaching."

“I would never abdicate my responsibility as an instructor to hand over core insights, information, or themes,” Mitrano wrote in a message to The SUNN Post. “But as material in support of the core, AI did a perfectly reasonable job of what I needed to complement my 1.5 hour lecture.”

When she uses AI to create content, Mitrano cites it directly on the slide deck, as a way of modeling for her students how they should cite AI in their research. Since 2023, Mitrano thinks every one of her students has used AI in some capacity to help with their classwork — from studying, to researching, to writing. However, she doesn’t see AI use in her classroom as a problem.

“The mistaken assumption that the student made to rely on AI to satisfy the requirement of the assignment is the problem,” Mitrano said. “They earn a failing grade not because they used AI, but because they failed to do the assignment as explicitly directed and because their answer lacked requisite resources, depth and/or critical thinking.”

Looking ahead, Mitrano calls generative AI both a “challenge” to pedagogy and, at the same time, “revolutionary.”

“Educators as well as students must come to understand and appreciate its profound capabilities, for better or worse,” Mitrano said.

Dr. Liza Long, an English professor at the College of Western Idaho, says that AI has “transformed” her pedagogy. Since 2023 she has integrated generative AI into her classes and grading, and has even written a textbook with ChatGPT as a coauthor. This past spring, Long taught her newest class, “Writing in the Age of Artificial Intelligence,” which she created with the help of Claude.ai, another large language model similar in ability to ChatGPT. She requires students to use AI to get through the course’s hefty workload, which includes more than 70 pages of writing.

“I also start every course with an AI literacy lesson where I explain to students that ChatGPT is not Google search,” wrote Long in a message to The SUNN Post. “It should be used to augment learning, not to replace it. Friction is necessary and even delicious in a learning context.”

Despite her pervasive use of AI in the classroom, Long recognizes that it is not a “neutral” tool. She cites algorithmic bias, climate change, job displacement, and the replacement of human labor that “disproportionately affects the Global South” as among her concerns.

Long says professors who are resisting the new technology have “good reasons for doing so,” but they need to be informed about it.

“In order to resist, they need to understand the technology, and this means fingers on keys,” Long said. “AI is not going away, and students want and need our guidance.”

Marc Watkins, who runs the AI Institute for Teachers and is an Assistant Director of Academic Innovation at the University of Mississippi, says that he generally doesn’t use AI in his classroom, but has been actively involved in building curriculum for the Ole Miss community on the topic. 

“We started the institute in June of 2023 because we knew that generative AI tools like ChatGPT were going to disrupt education,” wrote Watkins in a message to The SUNN Post. “I’ve had to rebuild the curriculum for it each time just to keep up to date about AI developments.” 

As AI’s role in higher education has evolved, he has been an advocate for open disclosure of its use. 

“We should take the time to create thoughtful standards around how we want to use this technology and model that behavior consistently with one another,” wrote Watkins. “Think of it as a social contract around AI usage. If I use it, I disclose it. I expect the same from anyone else.”

Dr. Nancy Darling, professor of psychology at Oberlin College and Conservatory, sees AI as a valuable learning tool for her students and herself, especially while coding, because it lets them work faster and with less frustration. In psychology, this kind of support is called “scaffolding.”

“You learn best when you’re working with someone who helps you do more than you could on your own,” Darling wrote to The SUNN Post. “The difference between what you can do alone and what you can do with help is called the ‘zone of proximal development.’ You learn best within that zone with effective scaffolding. ChatGPT can provide effective scaffolding.”

While Darling uses AI as a teaching aide and has colleagues who openly use it to grade assignments, she says others find it “unethical.” 

“They believe it was built off of stolen and plagiarized work, and thus all products of it are, essentially, stolen goods. They won’t touch it,” Darling wrote. “Some faculty think the enormous energy usage of AI makes use of it immoral for environmental reasons.”

Dr. Clancy Martin, professor of philosophy at the University of Missouri-Kansas City, who is helping rebind.ai with development and content, thinks an honest, open approach to technology in his classroom makes students more engaged.

Martin likens the arrival of generative AI in education to many technological innovations of the past. He thinks the key way to adapt is to be thoughtful as AI is integrated into the classroom.

“I do not regard it as a threat, any more or less than Microsoft Word was a threat,” Martin wrote to The SUNN Post. “People were once opposed to spell check. Now they don’t use it enough.”