Abstract
With the growing prevalence of social robots, the communication of emotions in human–robot interaction (HRI) has attracted increasing scholarly attention. As a primary channel for emotional expression, facial expressions play a critical role in shaping the accuracy and perceived warmth of interactions. However, current design practices for robotic facial expressions rely largely on subjective intuition and lack systematic, data-driven strategies. This study employs an experimental method to quantify the muscle movements associated with six basic human emotions (anger, disgust, fear, happiness, sadness, and astonishment) and to develop static facial expression design strategies tailored for screen-based social robots. Twenty participants aged between 20 and 30 were recruited. Facial muscle activation data were collected using ZIG SIM Pro, focusing on key facial regions, and sixteen facial action units were quantitatively acquired and analyzed. Based on these findings, the study proposes "dominant and subdominant muscle group" configuration strategies. By incorporating dynamic muscle data into the design process, this research establishes a structured, parametric framework for the graphical design of robotic facial expressions. Theoretically, it addresses the current lack of quantitative foundations for emotional HRI; practically, it offers robust strategic guidance for the development of social robot interfaces. As a result, this research aligns with the theme “Design, Art & Technology” of IASDR 2025.
Keywords
Social robots; Facial expression design; Human–robot interaction
DOI
https://doi.org/10.21606/iasdr.2025.1124
Citation
Zhu, J., and Zhang, Q. (2025) Research on Facial Expression Design Strategies for Social Robots, in Chang, C.-Y., and Hsu, Y. (eds.), IASDR 2025: Design Next, 02-05 December, Taiwan. https://doi.org/10.21606/iasdr.2025.1124
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License
Conference Track
Track 3 - Design, Art & Technology
Research on Facial Expression Design Strategies for Social Robots
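As a rough, hypothetical illustration of the kind of aggregation the abstract describes: an app such as ZIG SIM Pro can stream per-frame, ARKit-style facial blendshape coefficients from a phone, and these can be averaged per emotion to separate the most strongly activated ("dominant") action units from the moderately activated ("subdominant") ones. The action-unit names, data layout, and group sizes below are illustrative assumptions only; the study does not publish its processing pipeline.

```python
# Hypothetical sketch: ranking facial action units per emotion to suggest
# "dominant" and "subdominant" muscle groups for a screen-based robot face.
# AU names, thresholds, and data layout are illustrative assumptions.

from statistics import mean
from typing import Dict, List, Tuple

Frame = Dict[str, float]  # action-unit name -> activation in [0, 1]


def rank_action_units(frames: List[Frame]) -> List[Tuple[str, float]]:
    """Average each action unit's activation over all recorded frames
    and return the units sorted from most to least active."""
    units = frames[0].keys()
    averaged = {u: mean(f[u] for f in frames) for u in units}
    return sorted(averaged.items(), key=lambda kv: kv[1], reverse=True)


def dominant_subdominant(frames: List[Frame], n_dominant: int = 3, n_sub: int = 3):
    """Split the ranked units into a dominant group (drives the expression)
    and a subdominant group (adds nuance), discarding the remainder."""
    ranked = rank_action_units(frames)
    return ranked[:n_dominant], ranked[n_dominant:n_dominant + n_sub]


if __name__ == "__main__":
    # Two made-up frames of "happiness" data in the style of ARKit blendshape
    # coefficients that ZIG SIM Pro can stream from an iPhone (values invented).
    happiness = [
        {"mouthSmile": 0.82, "cheekSquint": 0.55, "eyeSquint": 0.40,
         "browInnerUp": 0.12, "jawOpen": 0.05},
        {"mouthSmile": 0.78, "cheekSquint": 0.60, "eyeSquint": 0.35,
         "browInnerUp": 0.10, "jawOpen": 0.08},
    ]
    dom, sub = dominant_subdominant(happiness, n_dominant=2, n_sub=2)
    print("dominant:", dom)      # e.g. mouthSmile, cheekSquint
    print("subdominant:", sub)   # e.g. eyeSquint, browInnerUp
```

In such a scheme, the dominant group would fix the overall geometry of the robot's on-screen expression, while the subdominant group would be available as secondary parameters for tuning intensity or warmth.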