Instructors across campus continue to adjust to students' increasing use of artificial intelligence (AI). It impacts departments differently but spares no instructor from having to think about how AI alters the delivery and testing of knowledge.
Faculty members Christine Denison and Michael Bugeja presented teaching talks on AI during CELT's spring series.
Four instructors talked with Inside about chatbots and how they either limit their use or harness them to enhance learning.
Accounting associate professor Christine Denison runs many of her class materials and assignments through ChatGPT before the semester begins to identify the responses students may receive when they use it and to see what information it gives users.
"Rather than just think about how students can cheat with it, it's what can students do with it," said Denison, who uses the chatbot for teaching and research applications. "It just takes experimentation."
One of her courses this fall focuses on communication, critical thinking and ethics. She will allow students to use ChatGPT, but only if they cite it in their work. Requiring citations allows Denison to see how students use the chatbot and whether they are developing the critical skills they'll need after graduation.
To discourage cheating, Denison advocates flipping the classroom so lectures are watched outside of class while work and discussion happen during the class meeting. She recommends using ChatGPT in class to generate information, then discussing what is right and wrong with the result.
"The instructor can get a good idea of students' knowledge through their comments on what ChatGPT did," Denison said.
She plans several ungraded Canvas quizzes to gauge students' initial knowledge of various subjects. With each assignment, Denison asks, "Do I care if they use AI?" The key is to have students, not ChatGPT, achieve the learning objectives in each course, she said.
Know the facts
Greenlee School of Journalism and Communication Distinguished Professor Michael Bugeja sees the impact of AI changing how and what students are taught. Fact checking is vital because chatbots can "hallucinate" or provide false or made-up information. Bugeja said he tells students he won't punish them through grades -- but paints a picture of their future to get the message across.
"Those who embrace critical thinking and use chatbots sparingly will be the supervisors of their classmates who did not know how to fact check," he said. "If they want to use ChatGPT in my class, they have to cite it, and if there is an AI hallucination in there, I will call them on it."
He said it's important to make students responsible for everything from attendance to cheating because it will serve them best after graduation. To help combat cheating, Bugeja has students critique his own writing to learn fact checking.
Bugeja, who recognizes the complexity and changing use of AI, said he sees an opportunity for the provost's office, Center for Excellence in Learning and Teaching, Faculty Senate and judiciaries to work together on a statement for the use of AI chatbots.
"We are at a juncture where the norms are changing and we need guidance on what is and is not acceptable in the classroom," Bugeja said. "I would envision a statement that would define proper and improper use."
Bugeja believes every syllabus should have a statement on AI, and developed one for his media ethics course that says, in part:
"We will not be monitoring the use of ChatGPT. But you should know that your instructor’s expertise is technical in nature, and he is quick to identify AI hallucinations. Language models generate false information that is easy to fact check. That said, if you use ChatGPT to help you write a discussion-board response, you will be cheating yourself of the critical thinking that is a hallmark of this class. Chatbots can inspire you. That is fine. But you should write the content."
Denison agreed on the need for a syllabus statement but emphasized the importance of taking time to talk with students throughout a semester about AI use. In instances of suspected use, Denison said she talks with the student and asks them to demonstrate their knowledge during the conversation.
"I had a couple of instances last semester where I had a question, but after talking to one of the students it became obvious she knew the material," Denison said.
Can they code?
ChatGPT's ability to generate computer code would seem to be an issue for computer science instructors, but computer science assistant professor Qi Li said it's user beware.
"If I am just asking for source code, then ChatGPT can find that online, but the strategy is in how you ask the question because you can fool ChatGPT," said Li, who is conducting research on the use of the chatbot. "You can use the same algorithm or ask for the same code, but write it a different way and ChatGPT may fail."
Li did not have rules for chatbots during her spring courses but will allow their use this fall with the understanding that wrong answers will have to be corrected by the student. She sees the need for students to learn how to use chatbots effectively because interacting with them successfully is a process of trial and error.
Even if they use ChatGPT, students still need to know how to code so they can verify accuracy. Li said students could feed a chatbot more information to help it reach the correct answer. That requires an understanding of classroom lessons and helps them develop communication skills.
"You have to know how to come up with the answer because ChatGPT will give an answer that seems very plausible, but often is wrong because it can't reason and is just mimicking intelligent thinking," she said.
Opportunities and drawbacks
Computer science assistant professor Matt Tancreti taught a freshman computer programming course in the spring and will teach a senior-level course this fall. The use of AI in each illustrates both the potential and the drawbacks of a chatbot.
In the freshman course, Tancreti typically tested students with small puzzle problems, but because students can simply copy and paste the question into ChatGPT, he flipped the process.
"We gave them the answer that ChatGPT gave, but it doesn't solve it perfectly and they have to test the code and fix it, if needed," he said. "Cheating tools and ways of cheating have been around for a long time, but what makes this different is how much it lowers the bar for ease of use."
This fall, Tancreti is teaching an analysis and design course and will welcome the use of a chatbot to do more of the "grunt work" and give students more time to develop their ideas.
Tancreti, like many instructors, said he spends more time devising novel problems to get around the chatbots. But catering to AI may mean students miss out on learning classic examples.
"Chatbots can take away from the class because now learning is based on more creative problems that are not the classic problems that have been used year after year to teach the subject," he said. "When you go off into industry, you can talk to your colleagues in a common language, which may now be lost."
Tancreti has seen other issues develop in the short time since ChatGPT arrived, including equal access. While older versions of ChatGPT remain free (for now), new versions come with a subscription fee.