Abstract
Mental health chatbots have emerged as accessible tools for digital psychological support. However, many systems face limitations in conversational design, particularly in questioning strategies and interactional responsiveness. This review synthesises findings from 12 peer-reviewed studies published between 2015 and 2025, identifying common design issues including rigid dialogue structures, limited contextual awareness, and insufficient emotional engagement. Thematic analysis reveals a consistent lack of structured approaches to guide the design of empathic and context-aware interactions. In response, a conceptual framework is proposed to support the development of more adaptive and emotionally intelligent chatbot dialogues. The model outlines a four-stage design process with targeted principles and methods, offering a structured pathway from problem identification to implementation. This research highlights key gaps in existing chatbot design for mental health contexts and offers a practical foundation for improving user experience and therapeutic relevance. The framework may inform future efforts to design conversational agents that are better aligned with user needs and emotional dynamics.
Keywords
Mental health chatbots; Empathic interaction; Questioning strategies; Context-aware design; Conversational user interfaces; Design frameworks
DOI
https://doi.org/10.21606/iasdr.2025.671
Citation
Huang, T., and Liu, W. (2025) Reframing Questioning and Interaction Design in Mental Health Chatbots: Toward an Empathic and Contextually Responsive Design Framework, in Chang, C.-Y., and Hsu, Y. (eds.), IASDR 2025: Design Next, 02-05 December, Taiwan. https://doi.org/10.21606/iasdr.2025.671
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License
Conference Track
Track 9 - Healthcare Design
Reframing Questioning and Interaction Design in Mental Health Chatbots: Toward an Empathic and Contextually Responsive Design Framework