Library provides AI best practices guide for campus learners
November 19, 2025
Author: Jeff Budlong
In response to numerous requests from faculty and staff for guidance on artificial intelligence (AI), University Library faculty and staff published a LibGuide.
The guide, which has been live for more than a month and will receive periodic updates, covers AI basics, including definitions, how and when to use AI appropriately and a five-step process for evaluating information received from an AI source. Instruction librarians compiled data from numerous organizations, tapped experts on campus and incorporated several videos and charts to add interactive elements. References also are provided for users who want to learn more about any of the topics.
"We try to steer away from teaching to a tool and teach more broadly about information literacy," said instruction librarian Kate Garretson.
Questions to ask
The guide provides three questions to ask whenever you involve AI in your work:
How are you using it?
Why are you using it?
Should you be using it?
"A key question for students is, are you asking AI to do what you are supposed to be practicing for your learning -- and robbing yourself of that?" Garretson said. "For staff, it can be more: 'Am I using the correct tool for the task?' but still understanding that you have to verify everything AI produces."
To check information found online, the guide provides the acronym "SIFT":
Stop: Recognize your own biases, beliefs and potential blind spots and acknowledge they will affect your judgment.
Investigate the source: Who made it and what are their credentials?
Find better coverage: What credible sources cover this topic or issue?
Trace it back: Can you trace the claims back to the original context? Who first wrote about the topic and when was it first published?
"One of the points we stress in the guide is to not just think about what you can do with AI, but more should it be used and what questions need to be considered if it is used," said instructional design librarian Yen Verhoeven. "Using AI at a research institution can be difficult. If it's determined that something was plagiarized or incorrect, the blame isn't on AI, it's on the researcher."
Citing AI
Most scholarly journals and style manual writers agree that AI tools should not be considered authors, but still need to be credited when used or quoted, Garretson said. Instructors at ISU may have different standards for students citing and disclosing AI use. The guide provides three pieces of information to include:
The name of the AI tool used (including the version, if applicable)
How the tool was used
A statement taking responsibility for the assignment
"Researchers should always check with the journal they are submitting to or the style manual their field prefers for citation and disclosure statements," Garretson said. "I always tell people that over-disclosure is not a problem with AI use. Instructors should be explicit in what they want from students and give examples if possible."
Protect your information
The library's AI guide also can be found on Iowa State's central AI website, hosted by Information Technology Services. It's important for all members of the Iowa State community to understand what university information can and can't be used with AI tools. Confidential data classified as "moderate" or above by university standards may not be entered into any generative AI product unless the confidential data has been approved for such use in accordance with ISU's data classification policy.
ITS guidance directs faculty, staff and students to use Copilot or Google Gemini because prompts and responses are not shared or used to train AI foundation models. The key is that users log in with their ISU email and password.
Verhoeven said one of the issues with AI is it's still new and many of the companies leading its development are startups.
"Data can go to a third party and that needs to be vetted," she said. "People can sign on to one set of terms and conditions, and if the company is sold, those terms might change. If someone puts their research into an AI platform, they need to know what happens to that information."