17 Visualizing Primary Literature Search with AI (Biochemistry)
Author: Kristen Procko, Department of Molecular Biosciences and Biology Instructional Office, The University of Texas at Austin. Kristen “KP” Procko is the Biochemistry Education Fellow for the Department of Molecular Biosciences, coordinating curriculum redesign and assessment efforts. To see KP’s full bio, click here.
Author: Haleigh Bond, Department of Molecular Biosciences and Biology Instructional Office, The University of Texas at Austin. Haleigh Bond is a Biochemistry major with a minor in Forensic Science at The University of Texas at Austin. To see Haleigh’s full bio, click here.
Author: Isabella Sibrian, Department of Molecular Biosciences and Biology Instructional Office, The University of Texas at Austin. Isabella Sibrian, BSA, is a proud University of Texas at Austin alumna (Neuroscience, 2022) who performed research to identify genes and protein pathways mediating alcohol-induced behaviors in Drosophila melanogaster. To see Isabella’s full bio, click here.
Cite this Chapter: Procko, K. (2025). Visualizing Primary Literature Search with AI. In K. Procko, E. N. Smith, and K. D. Patterson (Eds.), Enhancing STEM Higher Education through Artificial Intelligence. The University of Texas at Austin. https://doi.org/10.15781/7dv5-g279
Description of resource(s):
This resource contains a guide for Research Rabbit, an AI program used to visualize literature searches. The guide includes an example biochemistry course assignment and our lessons learned in a corresponding Instructor Reflection, both of which can be easily adapted to any field. By analyzing the networks of papers that Research Rabbit produced from student work, we created an “output analysis” to help others interpret how the program builds its networks. Finally, we include a summary of student reflections after using this tool.
Links to Resources:
Why I implemented this:
Conducting a literature search can be a daunting task for students—traditional methods use keywords that return pages of text-based abstracts, which can overwhelm someone new to primary literature. To address this challenge, we explored whether Research Rabbit could improve students’ literature search experiences.
In Research Rabbit, a user enters one or more “seed papers” to generate a visual map of suggested papers, along with information about their citations. To learn more about Research Rabbit, we created a preliminary extra credit assignment in Spring 2024. Using student insights from that assignment and the Research Rabbit FAQ, we crafted the classroom exercise and implementation guide. This resource highlights the program’s most relevant features for our use case, alongside an Instructor Reflection documenting the issues we encountered during implementation.
Research Rabbit does not disclose how its AI generates the network, prompting us to investigate variance in similar networks. Specifically, we analyzed the features of the suggested manuscripts, especially when only one seed paper varied—we present a quantitative analysis comparing the networks generated from similar searches. Additionally, we thematically coded open-ended student responses to examine their perceptions of traditional versus AI-assisted search methods. By collaboratively coding these responses, we gained a deeper understanding of student perceptions about the advantages and limits of each approach.
My main takeaways:
Students were tasked with finding a foundational paper using both traditional search and Research Rabbit (see Research Rabbit Guide and Assignment). While most students preferred Research Rabbit for this task (62%; see Quantitative analysis), the primary challenge they reported was navigating the unfamiliar interface. We believe a step-by-step protocol could address this issue (see Instructor Reflection). Our analysis found the output sufficiently consistent for use in assignments: the same five seed papers gave identical networks, while varying only the fifth seed paper led to a network in which about 30 of the top 50 papers differed (see Output Analysis). Interestingly, several student comments expressed suspicion about how the AI tool selects papers and discussed bias in AI.
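The overlap comparison described above can be sketched in a few lines of code. The sketch below is illustrative only: Research Rabbit has no public API, so the two top-50 suggestion lists would be exported by hand, and the placeholder titles and counts here are hypothetical (chosen so the result mirrors our roughly 20-shared/30-different finding).

```python
# Sketch of a network-overlap comparison between two Research Rabbit searches.
# The suggestion lists are hypothetical stand-ins; real lists would be
# collected manually from the program's interface.

def network_overlap(list_a, list_b):
    """Return (shared, only_a, only_b) sets of paper titles for two lists."""
    a, b = set(list_a), set(list_b)
    return a & b, a - b, b - a

# Hypothetical top-50 lists from two searches sharing four of five seed papers.
network_1 = [f"paper_{i}" for i in range(50)]        # seeds 1-5
network_2 = [f"paper_{i}" for i in range(30, 80)]    # seeds 1-4 + alternate 5th

shared, only_1, only_2 = network_overlap(network_1, network_2)
print(f"{len(shared)} papers shared; {len(only_1)} unique to network 1; "
      f"{len(only_2)} unique to network 2")
# prints "20 papers shared; 30 unique to network 1; 30 unique to network 2"
```

An instructor repeating our output analysis on a different topic could paste their own exported title lists into `network_1` and `network_2` to quantify how much a single seed-paper change shifts the suggestions.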
What else should I consider?
Timing: We implemented this activity in course discussion sections, a teaching assistant-led hour (outside of the main lecture time) where students engage in active learning. Because our students had questions about using the Research Rabbit interface, we recommend a discussion-style setting or some dedicated class time for students to explore the interface and ask questions.
Adaptations: To effectively integrate Research Rabbit into academic or research settings, students must first be trained in traditional research methods to evaluate the quality and relevance of academic papers. Professors or librarians can offer workshops on using databases like PubMed, focusing on key skills such as understanding citations, identifying influential papers, and assessing research credibility.
The quality of Research Rabbit’s output relies on the input seed papers, which should be chosen based on factors like the principal investigator (PI), journal reputation, or citation impact. In our sample assignment, seed papers were selected primarily for their high citation counts, publication in peer-reviewed journals, and direct relevance to the topic.
If students are tasked with selecting their own seed papers, structured training is critical. They should learn how to conduct traditional literature searches, including crafting effective search queries, tracking citation trails, and filtering for highly cited and relevant articles. For example, to find seed papers for our biochemistry assignment, students could use keywords like “hemoglobin structure” or “hemoglobin mutation” and focus on peer-reviewed articles or foundational studies frequently cited in the field.
Want to Learn More?
Research Rabbit has been featured in the news, often in articles describing other AI-based tools for research. Two recent articles are shown below:
- This Euro News article, “The best AI tools to power your academic research,” describes a variety of AI tools and features Research Rabbit.
- This article from Beebom, “10 Best AI Tools for Research,” is another recent collection of useful AI tools for research.
There are some scholarly articles about Research Rabbit as well. The following citation provides a detailed guide to the Research Rabbit menus, and walks the reader through a specific search, providing tables of the top citations when viewing in different ways (e.g., using the author search).
- Sharma, R., Gulati, S., Kaur, A., Sinhababu, A., & Chakravarty, R. (2022). Research discovery and visualization using ResearchRabbit: A use case of AI in libraries. COLLNET Journal of Scientometrics and Information Management, 16(2), 215–237. https://doi.org/10.1080/09737766.2022.2106167 [PDF Link]
Another manuscript took a constructionist approach to teaching literature search, having students create literature review-related products from activities performed using three AI tools. Research Rabbit was used to construct a network of the 20 papers most related to students’ topics, and the authors used asynchronous YouTube video tutorials for students to self-train in using Research Rabbit.
- Cronjé, J. C. (2024, March). A Constructionist Approach to Learning with AI: An “Old” Solution to a “New” Problem? In Future of Information and Communication Conference (pp. 13–22). Cham: Springer Nature Switzerland. https://link.springer.com/chapter/10.1007/978-3-031-53963-3_2