Paper
28 October 2021 Appearance-based gaze estimation with multi-modal convolutional neural networks
Fei Wang, Yan Wang, and Teng Li
Proceedings Volume 11884, International Symposium on Artificial Intelligence and Robotics 2021; 118840L (2021) https://doi.org/10.1117/12.2603762
Event: International Symposium on Artificial Intelligence and Robotics 2021, 2021, Fukuoka, Japan
Abstract
Existing appearance-based gaze estimation methods mostly regress gaze direction from eye images alone, neglecting facial information and head pose, which can be highly informative. In this paper, we propose a robust appearance-based gaze estimation method that regresses gaze direction jointly from the human face and eyes. The face and eye regions are located from detected landmark points, and representations of the two modalities are modeled with convolutional neural networks (CNNs), which are then combined for gaze estimation by a fusion network. Furthermore, considering the varying impact of different facial regions on human gaze, spatial weights for the facial area are learned automatically with an attention mechanism and applied to refine the facial representation. Experimental results on the Eyediap benchmark dataset validate the benefits of fusing multiple modalities for gaze estimation, and the proposed method outperforms previous state-of-the-art methods.
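The abstract describes two key operations: attention-derived spatial weights that refine the facial feature map, and fusion of the face and eye modalities before regression. The following is a minimal NumPy sketch of those two steps; all names, shapes, and the stand-in weight layer are illustrative assumptions, not the authors' actual network.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a flat array."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical CNN outputs (shapes are illustrative, not from the paper):
# face_feat: feature map for the face region, shape (C, H, W)
# eye_feat: pooled feature vector for the eye region, shape (D,)
rng = np.random.default_rng(0)
face_feat = rng.standard_normal((32, 7, 7))
eye_feat = rng.standard_normal(64)

# Spatial attention: a random 1x1-convolution-style weight vector stands in
# for the learned attention layer; softmax over spatial positions yields
# weights that sum to 1 across the facial area.
w = rng.standard_normal(32)
scores = np.tensordot(w, face_feat, axes=([0], [0]))  # (H, W)
attn = softmax(scores.ravel()).reshape(scores.shape)

# Refine the facial representation: weight each spatial position, then pool.
face_vec = (face_feat * attn).sum(axis=(1, 2))        # (C,)

# Fuse the two modalities by concatenation before the regression head.
fused = np.concatenate([face_vec, eye_feat])          # (C + D,) = (96,)
print(fused.shape)
```

In the paper the attention weights and the fusion network are learned end to end; concatenation is only one plausible fusion strategy, used here because the abstract does not specify the exact combination scheme.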
© (2021) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Fei Wang, Yan Wang, and Teng Li "Appearance-based gaze estimation with multi-modal convolutional neural networks", Proc. SPIE 11884, International Symposium on Artificial Intelligence and Robotics 2021, 118840L (28 October 2021); https://doi.org/10.1117/12.2603762
KEYWORDS: Image fusion, Convolution, Eye models, Feature extraction, Convolutional neural networks, Data fusion, Network architectures
