18 Reading Scientific Literature with AI (Intro Biochemistry)
Author: Amanda Vines Garrett, Assistant Professor of Instruction, Molecular Biosciences and Biology Instructional Office, The University of Texas at Austin

Amanda Vines Garrett is an assistant professor of instruction in MBS & BIO. She teaches a biochemistry lab (BCH 219L) and a biotech lab skills course (BCH 315T). Amanda has a B.S. in cell biology from Dallas Baptist University and a Ph.D. in cell biology from Yale.

Cite this Chapter: Vines-Garrett, A. (2025). Reading Scientific Literature with AI. In K. Procko, E. N. Smith, and K. D. Patterson (Eds.), Enhancing STEM Higher Education through Artificial Intelligence. The University of Texas at Austin. https://doi.org/10.15781/7dv5-g279
Description of resource(s):
This OER equips students to use AI to navigate scientific literature and to evaluate AI's accuracy and efficacy for this task. The main activity can be performed either in class/lab or asynchronously, but it works best in a context where discussion is encouraged and assistance can be provided. The activity also assumes that students have previously performed a literature search for scientific articles using library resources/databases at their institution.
Links to Resources:
Why I implemented this:
Students in my sophomore biochemistry lab have a range of research experience: some began research as freshmen, while others have never done research. My goals for this project were (1) to equip students to better understand how to read scientific articles, preparing them for an upcoming literature-based research project, and (2) to have them analyze the effectiveness and accuracy of AI for this process.
I started by investigating tools that would let students interact directly with scientific articles by asking AI questions such as "what is in Figure 2?" or "can you give me the abstract?" I settled on Microsoft Copilot since it is a UT Austin-approved tool and accepts direct PDF uploads. It also links to its internet sources and sometimes suggests follow-up questions, which helps students dive deeper into their research topic.
From there, I wrote an activity in which students attempt to:
- Understand main components of the article through its abstract
- Compare their own summaries of the information to AI’s summaries
- Glean information from article sections that students often struggle to interpret (e.g., Methods and Results), and
- Critique the accuracy and/or usefulness of Copilot’s answers.
My main takeaways:
Overall, students seemed to benefit from the activity. Students’ prior experience with research and/or AI influenced both their views of AI’s accuracy and usefulness and their gains in learning to read scientific articles. Students with less research experience seemed to make larger gains in science literacy; those with more research experience were more critical of AI and found the activity less useful overall, especially if they had previously read the articles used for the activity.
What else should I consider?
Preparation: The instructor and/or students would need to provide the following:
- Research topic: e.g., for an upcoming research paper or presentation later in the course, or just diving deeper into a topic from earlier in the course
- Article PDFs: 2 to 4 scientific research articles and reviews (students perform a literature search prior to activity, or instructor provides relevant article PDFs for a particular topic)
- Computer with internet & word processor: For each student/group, a computer with internet, including access to Adobe PDF Compressor, and a word processor/access to Google Sheets
- Access to Copilot: we have institutional access, or a free version can be used
- Student permission to use graded assignments (for certain uses): If student submissions will be used in other contexts such as sharing/discussing results with others, written permission to use graded work needs to be obtained from students ahead of time to fulfill FERPA requirements
Timing: The in-class activity takes about 2 hours; it could be split into two ~1-hour sessions. I provided asynchronous resources for students to review ahead of time. Alternatively, introductory information could be provided and/or a discussion about AI could be led in ~30 minutes or more of class time before the activity. Students also completed reflective questions after the activity and submitted their full prompts and responses, which took them ~30 minutes to 1 hour after lab.
Context: This activity is most suitable for undergraduate science students who are new to research and/or reading scientific articles. Students who have prior experience with either AI or research, but not both, would also benefit somewhat. It could also be adapted for advanced high school students.
Adaptations:
- This activity could be adapted for students with a slightly lower or higher level of research background by removing portions of the activity, requiring more critical output from students about AI’s responses, phrasing prompts in more specific ways, etc. For example, specifying “I am a sophomore biochemistry student” in a prompt can tailor the AI responses to the students’ skill level.
- The activity is also adaptable to varying class sizes, class time lengths, individual vs. group work, etc.
- Learning to read scientific articles is a valuable research skill in its own right, so the activity could also be part of a CURE (course-based undergraduate research experience), a summer research program, or training materials for faculty lab research. The two main adaptations I would suggest in that setting are (1) customizing the research topics to the research of the lab or course, and (2) implementing more direct coaching by a mentor and/or follow-up writing assignments without AI to ensure that the student is learning accurate and sufficiently detailed information.
Potential Pitfalls: Other AI models may be effective for this activity but can pose security risks. I originally considered ChatPDF, which visually displays an uploaded article; however, data security issues precluded its use. Also, be flexible! Copilot was updated between the activity’s development and implementation, which changed how it responded (e.g., tighter copyright restrictions on articles). Regular testing and flexible expectations for AI responses are recommended. Finally, students who had not previously read the articles in depth got more out of the activity, so when selecting articles, I recommend encouraging students not to read them closely ahead of time, beyond ensuring they suit the research topic.