2 Navigating Changing University Policies
As university policies around AI evolved at The University of Texas at Austin, the Office of Academic Technology requested that AI in Education Grants support only projects that clearly described transformational uses of AI. Faculty were provided the following descriptions, and the grant administrators guided applicants to strengthen the transformational aspects of their work.
Transactional AI use was defined as prompting AI for an output without critical analysis. Transformational AI use was described as integrating critical analysis, by either students or the faculty member. Below are some examples of transformational approaches:
- Classroom activities: student reflections on the accuracy/effectiveness of AI output, annotated work highlighting human vs. AI contributions
- Instructor tools: comparison of AI output with instructor-generated work, comparison of output from different AI tools
Limitations on AI Tools
The first cohort had a great deal of freedom in selecting AI tools for their projects, while guidance to the second cohort was more restrictive. Applicants using generative AI were encouraged to consider primarily University-contracted tools (e.g., Microsoft Copilot) for projects implemented during Fall of 2024. If applicants found another tool better suited to their project, they could take steps to utilize that tool (see Appendix 2: University Policies), but the more complex process for non-contracted tools may have resulted in instructors using AI technologies that were not ideal for their projects. For example, one grant recipient reported having to develop workarounds to file upload restrictions in the chapter Reading Scientific Literature with AI, which may not have been an issue with the originally proposed AI tool.
Protecting Privacy and Intellectual Property
During a project carried out in Spring of 2024, one grant recipient noted that some students did not wish to use AI tools in their work. Thus, we began recommending that instructors plan alternative assignments that did not require AI. Accordingly, options that allow students to choose whether or not to use an AI tool begin to appear among our Fall 2024 cohort (see Project Outcomes and Major Themes). Some students were open to using AI tools but not to entering personal information into them. Therefore, some instructors provided mock materials so that students could gain experience with the tools without raising these privacy concerns.
For projects that required the entry of an instructor’s intellectual property (e.g., exam questions, see AI-Assisted Exam Creation) or student quotes (see Summarizing Student Feedback with AI), we purchased subscriptions for tools with additional security. Namely, we procured the enterprise versions of Microsoft Copilot and ChatGPT to support projects that required the entry of sensitive data. For the second cohort, we also guided faculty and students to disable settings that allow AI tools to use entered prompts for further model training (see Appendix 2: University Policies). We recognize that navigating this new territory is complicated, and we intend these sections to help instructors begin making decisions related to policy and privacy while balancing the selection of the most effective AI tools for their projects.