Multi-face emotion detection for effective Human-Robot Interaction
Conference: communication with proceedings at an international conference
The integration of dialogue interfaces in mobile devices has become ubiquitous, providing a wide array of
services. As technology progresses, humanoid robots designed with human-like features to interact effectively
with people are gaining prominence, and the use of advanced human-robot dialogue interfaces is continually
expanding. In this context, emotion recognition plays a crucial role in enhancing human-robot interaction by
enabling robots to understand human intentions. This research proposes a facial emotion detection interface
integrated into a mobile humanoid robot, capable of displaying the emotions of multiple individuals in real
time on a user interface. To this end, several deep neural network models for facial expression recognition
were developed and evaluated under consistent computer-based conditions, yielding promising results. A
trade-off between accuracy and memory footprint was then carefully weighed so that the application could run
effectively on a mobile humanoid robot.
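The pipeline described above (detect every face in a frame, classify each face's expression, and report per-face labels for the user interface) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the stub detector, the flattened-pixel features, and the single softmax layer are all placeholder assumptions standing in for the real face detector and the trained deep neural network models.

```python
import numpy as np

# Hypothetical emotion label set; the paper does not list its classes.
EMOTIONS = ["angry", "happy", "neutral", "sad", "surprised"]

def detect_faces(frame):
    # Stub detector: a real system would use a Haar cascade or a CNN-based
    # detector returning one bounding box (x, y, w, h) per face.
    h, w = frame.shape[:2]
    return [(0, 0, w // 2, h // 2), (w // 2, h // 2, w // 2, h // 2)]

def preprocess(face):
    # Placeholder features: normalise pixel values to [0, 1] and flatten.
    return face.astype(np.float32).ravel() / 255.0

def classify(features, weights, bias):
    # Softmax over emotion logits; random weights stand in for a trained model.
    logits = features @ weights + bias
    exp = np.exp(logits - logits.max())
    probs = exp / exp.sum()
    return EMOTIONS[int(np.argmax(probs))], probs

def annotate_frame(frame, weights, bias):
    # One (box, label) pair per detected face, as displayed on the robot's UI.
    results = []
    for (x, y, w, h) in detect_faces(frame):
        face = frame[y:y + h, x:x + w]
        label, _ = classify(preprocess(face), weights, bias)
        results.append(((x, y, w, h), label))
    return results

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # fake grayscale frame
W = rng.standard_normal((32 * 32, len(EMOTIONS))) * 0.01     # untrained weights
b = np.zeros(len(EMOTIONS))
annotations = annotate_frame(frame, W, b)
```

Swapping `classify` for a quantised or pruned network is where the paper's accuracy-versus-memory trade-off would enter: the detection loop stays the same while the per-face model shrinks to fit the robot's on-board resources.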