TY - JOUR
T1 - Multi-View Facial Expressions Analysis of Autistic Children in Social Play
AU - Zeng, Jiabei
AU - Yuan, Yujian
AU - Qu, Lu
AU - Chang, Fei
AU - Sun, Xuran
AU - Gong, Jinqiuyu
AU - Han, Xuling
AU - Liu, Min
AU - Zhao, Hang
AU - Liu, Qiaoyun
AU - Shan, Shiguang
AU - Chen, Xilin
N1 - Publisher Copyright:
© 2010-2012 IEEE.
PY - 2025
Y1 - 2025
N2 - Atypical facial expressions during interaction are among the early symptoms of autism spectrum disorder (ASD) and are included in standard diagnostic assessments. However, current assessment methods rely on subjective human judgment, introducing bias and limiting objectivity. This paper proposes an automated framework for the objective, quantitative assessment of autistic children’s facial expressions during social play. First, we use four synchronized cameras to record interactions between children with ASD and teachers during teacher-led structured activities. To address the challenges posed by head movements and occluded faces, we introduce a multi-view facial expression recognition strategy, whose effectiveness is demonstrated by experiments in real-world settings. To quantify affect patterns and the dynamic complexity of facial expressions, we use the temporally accumulated distribution of the basic facial expressions and the multi-dimensional multi-scale entropy of the facial expression sequence. Analysis of these features reveals significant differences between children with ASD and typically developing (TD) children. Experimental results derived from our quantified features confirm conclusions drawn from previous research and experience-based observations. Using these facial expression features, ASD and TD children are accurately classified (92.1% accuracy, 94.4% precision, 89.5% sensitivity, 94.7% specificity) in empirical experiments, suggesting the potential of our framework for improved ASD assessment.
AB - Atypical facial expressions during interaction are among the early symptoms of autism spectrum disorder (ASD) and are included in standard diagnostic assessments. However, current assessment methods rely on subjective human judgment, introducing bias and limiting objectivity. This paper proposes an automated framework for the objective, quantitative assessment of autistic children’s facial expressions during social play. First, we use four synchronized cameras to record interactions between children with ASD and teachers during teacher-led structured activities. To address the challenges posed by head movements and occluded faces, we introduce a multi-view facial expression recognition strategy, whose effectiveness is demonstrated by experiments in real-world settings. To quantify affect patterns and the dynamic complexity of facial expressions, we use the temporally accumulated distribution of the basic facial expressions and the multi-dimensional multi-scale entropy of the facial expression sequence. Analysis of these features reveals significant differences between children with ASD and typically developing (TD) children. Experimental results derived from our quantified features confirm conclusions drawn from previous research and experience-based observations. Using these facial expression features, ASD and TD children are accurately classified (92.1% accuracy, 94.4% precision, 89.5% sensitivity, 94.7% specificity) in empirical experiments, suggesting the potential of our framework for improved ASD assessment.
KW - Multi-view facial expression recognition
KW - autism spectrum disorder assessment
KW - facial expression analysis
KW - facial expression quantification
KW - multi-scale entropy
UR - https://www.scopus.com/pages/publications/105002610135
U2 - 10.1109/TAFFC.2025.3557458
DO - 10.1109/TAFFC.2025.3557458
M3 - Article
AN - SCOPUS:105002610135
SN - 1949-3045
VL - 16
SP - 2200
EP - 2214
JO - IEEE Transactions on Affective Computing
JF - IEEE Transactions on Affective Computing
IS - 3
ER -