Abstract
This study explores the development, implementation, and initial testing of an AI reviewer tool leveraging generative AI (GenAI) to enhance learning experiences in an introduction to human-centered design (HCD) course. The tool was designed using rubric-driven prompts tailored for retrieval-augmented generation (RAG), ensuring feedback that is relevant, consistent, and actionable. To evaluate its effectiveness, the AI reviewer tool was tested in the Fall 2024 course, where it generated feedback on student design project artifacts, including documentation and presentations. Students and the instructor later engaged separately in reflective discussions to assess the feedback's clarity, usefulness, and impact on learning. The analysis of these reflections revealed the tool's potential to guide iterative design processes, foster deeper engagement with course concepts, and complement traditional instructional feedback. Findings will inform future implementations, highlighting the role of AI-driven feedback in enhancing educational practices and preparing students for real-world design challenges.
DOI
https://doi.org/10.21606/drslxd.2025.171
Citation
Saini, A., Shehab, S., Cope, W., Kalantzis, M., de Castro, V. C., and Schultz, A. (2025) Exploring the Use of GenAI Feedback during Design Projects, in Clemente, V., Gomes, G., Reis, M., Félix, S., Ala, S., Jones, D. (eds.), Learn X Design 2025, 22-24 September 2025, Aveiro, Portugal. https://doi.org/10.21606/drslxd.2025.171
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License
Conference Track
Full Paper
Exploring the Use of GenAI Feedback during Design Projects