Training a Camera to Perform Long-Distance Eye Tracking by Another Eye-Tracker

IEEE Access (2019)

Abstract
Appearance-based gaze estimation techniques have advanced greatly in recent years. However, in previous studies, appearance-based gaze estimation with a single camera has been limited to short distances, and labeling training samples has been a time-consuming and user-unfriendly step. To bridge these gaps, this paper presents a new long-distance gaze estimation paradigm: training a camera to perform eye tracking with the help of another eye tracker, named the Learning-based Single Camera eye tracker (LSC eye-tracker). In the training stage, the LSC eye-tracker simultaneously acquires gaze data from a commercial trainer eye tracker and face appearance images from a long-distance trainee camera, from which deep convolutional neural network (CNN) models learn the mapping from appearance images to gaze points. In the application stage, the LSC eye-tracker works alone, predicting gaze points from the appearance images acquired by the single camera using the trained CNN models. Our experimental results show that the LSC eye-tracker enables both population-based and personalized eye tracking with promising accuracy and performance.
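The abstract does not specify the CNN architecture or training details; the following is a minimal sketch, assuming a generic PyTorch CNN regressor in which each trainee-camera face image is paired (e.g., by timestamp) with a 2D gaze point supplied by the trainer eye tracker. The class and function names are hypothetical, not the authors' code.

```python
import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    """Hypothetical CNN regressor: face appearance image -> 2D gaze point.

    The paper's actual network layout is not given in this abstract;
    this is only an illustrative stand-in.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 2),  # (x, y) on-screen gaze coordinates
        )

    def forward(self, x):
        return self.head(self.features(x))

def train_step(model, optimizer, images, gaze_labels):
    """One supervised step: images come from the long-distance trainee
    camera, gaze_labels from the commercial trainer eye tracker."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(images), gaze_labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

At application time, only the trained model and the single camera are needed: `model(images)` returns predicted gaze points without the trainer eye tracker.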
Keywords
Eye tracking, gaze estimation, human-computer interaction, machine learning