A novel deep learning approach for facial emotion recognition: application to detecting emotional responses in elderly individuals with Alzheimer’s disease
Authors: Amine Bohi (LINEACT), Yassine El Boudouri (CRIStAL), Imad Sfeir
Article: Peer-reviewed international or national journal article - 13/06/2024 - Neural Computing and Applications
Facial expressions are a critical form of non-verbal communication, conveying a wide range of emotions. Recent advancements in artificial intelligence and computer vision have led to the development of deep learning methods, particularly convolutional neural networks, that are highly effective in facial emotion recognition (FER). This paper presents EmoNeXt, an advanced deep learning framework for FER that builds upon a modified ConvNeXt architecture and incorporates several key innovations. EmoNeXt integrates Spatial Transformer Networks (STNs) to enable the model to focus on the most expressive regions of the face, Squeeze-and-Excitation (SE) blocks to enhance channel dependencies, and a self-attention regularization term that encourages the learning of compact and discriminative feature vectors. Initially evaluated on the FER2013 dataset, EmoNeXt is now further validated on two other widely used benchmark datasets, AffectNet and CK+, to demonstrate its robustness and generalizability across various real-world and posed scenarios. Additionally, we conduct an extensive ablation study to analyze and quantify the contribution of each enhancement, confirming their positive impact on model performance. Finally, this paper explores the application of EmoNeXt in emotion recognition for elderly individuals with Alzheimer’s disease (AD), highlighting the urgent need for accurate emotion recognition to improve patient care. Our results underscore the potential of EmoNeXt as a valuable tool for enhancing emotional communication in healthcare settings, particularly for patients with neurodegenerative disorders.
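The Squeeze-and-Excitation blocks mentioned in the abstract reweight feature-map channels via a learned gating signal. The following is a minimal NumPy sketch of that mechanism only; the tensor shapes, the reduction ratio, and the weight names are illustrative assumptions, not the exact configuration used in EmoNeXt.

```python
import numpy as np

def se_block(x, w1, w2):
    """Squeeze-and-Excitation gating over the channels of a feature map.

    Illustrative shapes (not the paper's configuration):
    x  : (C, H, W) feature map
    w1 : (C // r, C) squeeze projection, r = reduction ratio
    w2 : (C, C // r) excitation projection
    """
    # Squeeze: global average pooling yields one descriptor per channel.
    z = x.mean(axis=(1, 2))                      # shape (C,)
    # Excitation: bottleneck MLP, ReLU then sigmoid, gives per-channel gates in (0, 1).
    s = np.maximum(w1 @ z, 0.0)                  # shape (C // r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))          # shape (C,)
    # Scale: multiply each channel of the input by its gate.
    return x * s[:, None, None]
```

With zero weights every gate is sigmoid(0) = 0.5, so the block halves all activations; trained weights instead learn to amplify informative channels and suppress the rest, which is how SE blocks "enhance channel dependencies" in the abstract's phrasing.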