Generative AI in the Classroom FAQs
Below are common questions faculty have about the use of generative AI in the classroom, along with information on IU policy, tips, and best practices.
What existing AI policies does IU have?
Indiana University has approved the use of Microsoft Copilot for faculty, staff, and students aged 18 and older. It is the only generative AI tool approved for use with university internal data, and you must be logged in through your IU account for that approval to apply. For more information, see IU's page on Acceptable Uses of Generative AI and the Policies and Guidance teaching resource. IU has also established an AI Taskforce, which has developed recommendations for AI use.
What do I do if I suspect a student has used AI unethically?
We recommend adopting a learner-centered approach when talking with students about their use of AI. To learn more, visit the CITL’s page on How to Productively Address AI-Generated Text in Your Classroom.
How can I use AI as an instructor?
There are many ways you as an instructor can use AI tools to generate productive content for your research and teaching, from creating rubrics and course learning objectives to increasing the relevance of your content so it better represents the students in your class. As you integrate these tools into your workflow, make sure to consider IU’s Acceptable Uses of Generative AI.
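As one hypothetical illustration, you might ask Copilot: “Draft three measurable learning objectives for an introductory statistics unit on hypothesis testing, written for first-year undergraduates.” This example prompt is not from IU guidance; adapt the course, topic, and audience to your own context, and review anything AI generates before using it.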
Does IU have AI detection software?
IU has not approved any AI detection tools for use. AI detection tools are unreliable and can lead to students being falsely accused of academic misconduct. In addition, studies show that AI detection software disproportionately flags the writing of specific demographics, especially non-native English writers.
How can instructors effectively use AI in the classroom while still promoting critical thinking?
It is essential to ensure that students use AI tools to develop and hone critical thinking skills rather than to circumvent learning. The CITL offers resources to help you navigate this challenge.
Are there examples of syllabus language for AI use?
Check with your department to ensure your syllabus policy aligns with any departmental policies or recommendations, and stay up to date on IU's policies regarding generative AI. For examples of AI policies used in syllabi at a variety of universities, see Tracy Moore’s collection of University Policies on generative AI.
Are students just using these tools to cheat?
Used ethically and effectively, generative AI tools have the potential to further student learning. However, many faculty worry that students will use these tools in ways that hinder rather than help their development. It may surprise you to learn that early research indicates AI has not increased cheating. We recommend that faculty adopt a proactive rather than punitive approach to AI: students who understand the purpose of their assignments and are invested in their own learning are less likely to over-rely on AI tools.
How do I create an effective prompt?
Generative AI outputs are only as strong as the prompts they are given. Several factors make a strong prompt, including specificity, ample context, and clear roles (i.e., ensuring the AI understands who the author and the audience are). For help creating prompts, refer to AI for Education’s Prompt Library.
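For example, instead of a vague request like “Write a rubric for my essay assignment,” a stronger prompt (a hypothetical sketch, not IU-provided language) might read: “You are an instructor in a first-year college composition course. Create a four-criterion grading rubric for a 1,000-word persuasive essay, using language first-year undergraduates will understand.” The revision assigns a role, supplies context about the course and assignment, and names a specific task and audience.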
Can I make my students use AI?
Microsoft Copilot is the only tool approved for use by students aged 18 and older. Because this tool is not available to all students, instructors cannot require students to use AI tools. Many companies use inputs to continue training their models, which means there is no guarantee that students’ data is protected. Always advise students to exercise caution and consider the data privacy risks associated with generative AI use. If you choose to use AI in your class, include a disclaimer regarding privacy and provide alternative ways to engage for students who do not wish to use AI.
Are there concerns about bias in AI models and outputs?
While AI tools cannot hold intentional biases, there are inherent biases in their algorithms as well as in the data on which the models are trained; after all, that data is created by humans. It can also be difficult to tell where an AI tool gets its information, which makes outputs hard to verify without further research. Always verify AI outputs and consider the limitations of these tools.
Are there concerns about privacy and intellectual property when it comes to AI tools?
Yes, there are concerns regarding privacy and intellectual property for generative AI users. This is why accessing Microsoft Copilot through IU authentication is the only university-approved use of AI with internal data. Companies often use user data to train their models, which creates vulnerabilities. Read user agreements before adopting an AI tool, and follow IU’s generative AI policy whenever you use these tools.