Abstract

With the growing prevalence of social robots, the communication of emotion in human–robot interaction (HRI) has attracted increasing scholarly attention. As a primary channel of emotional expression, facial expressions play a critical role in shaping the accuracy and perceived warmth of interactions. However, current design practices for robotic facial expressions rely largely on subjective intuition and lack systematic, data-driven strategies. This study employs an experimental method to quantify the muscle movements associated with six basic human emotions (anger, disgust, fear, happiness, sadness, and surprise) and to develop static facial expression design strategies tailored to screen-based social robots. Twenty participants aged 20 to 30 were recruited. Facial muscle activation data were collected using ZIG SIM Pro, focusing on key facial regions, and sixteen facial action units were quantitatively acquired and analyzed. Based on these findings, the study proposes "dominant and subdominant muscle group" configuration strategies. By incorporating dynamic muscle data into the design process, this research establishes a structured, parametric framework for the graphical design of robotic facial expressions. Theoretically, it addresses the current gap in quantitative foundations for emotional HRI; practically, it offers robust strategic guidance for the development of social robot interfaces. Accordingly, this research aligns with the theme "Design, Art & Technology" of IASDR 2025.

Keywords

Social robots; Facial expression design; Human–robot interaction

Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Conference Track

Track 3 - Design, Art & Technology

Dec 2nd, 9:00 AM to Dec 5th, 5:00 PM

Research on Facial Expression Design Strategies for Social Robots

