Exploring Eye Expressions for Enhancing EOG-Based Interaction.

INTERACT (4) (2023)

Abstract
This paper explores the classification of eye expressions for EOG-based interaction using JINS MEME, an off-the-shelf eye-tracking device. Previous studies have demonstrated the potential of electrooculography (EOG) for hands-free human-computer interaction using eye movements (directional, smooth pursuit) and eye expressions (blinking, winking). We collected a comprehensive set of 14 eye gestures to explore how well both types of eye gestures can be classified together in a machine learning model. Using a Random Forest classifier trained on 15 engineered features extracted from the collected data, we obtained an overall classification performance of 0.77 (AUC). Our results show that we can reliably classify eye expressions, enhancing the range of available eye gestures for hands-free interaction. With continued development and refinement of EOG-based technology, our findings have long-term implications for improving the usability of the technology in general and for individuals who require a richer vocabulary of eye gestures to interact hands-free.
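To illustrate the kind of pipeline the abstract describes, below is a minimal sketch of training a Random Forest on engineered features and scoring it with a one-vs-rest AUC. This is not the authors' code: the feature values, dataset size, split, and hyperparameters are placeholder assumptions; only the dimensions (14 gesture classes, 15 features) and the evaluation metric follow the abstract.

```python
# Hedged sketch: Random Forest over engineered EOG features, evaluated by AUC.
# All data here is synthetic; real features would come from JINS MEME EOG signals.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: 14 eye-gesture classes, 15 engineered features per sample.
n_samples, n_features, n_classes = 1400, 15, 14
X = rng.normal(size=(n_samples, n_features))    # engineered EOG features (assumed)
y = rng.integers(0, n_classes, size=n_samples)  # gesture labels (assumed)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Multi-class AUC (one-vs-rest), comparable in spirit to the 0.77 reported.
proba = clf.predict_proba(X_test)
auc = roc_auc_score(y_test, proba, multi_class="ovr")
print(f"One-vs-rest AUC: {auc:.2f}")
```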
Key words
eye expressions, interaction, EOG-based