Explainable Interfaces for Rapid Gaze-Based Interactions in Mixed Reality

Mengjie Yu, Dustin Harris, Ian Jones, Ting Zhang, Yue Liu, Naveen Sendhilnathan, Narine Kokhlikyan, Fulton Wang, Co Tran, Jordan L. Livingston, Krista E. Taylor, Zhenhong Hu, Mary A. Hood, Hrvoje Benko, Tanya R. Jonker

arXiv (2024)

Abstract
Gaze-based interactions offer a potential way for users to naturally engage with mixed reality (XR) interfaces. Black-box machine learning models have enabled higher accuracy for gaze-based interactions. However, due to the black-box nature of these models, users may be unable to understand and effectively adapt their gaze behavior to achieve high-quality interaction. We posit that explainable AI (XAI) techniques can facilitate understanding of, and interaction with, gaze-based, model-driven systems in XR. To study this, we built a real-time, multi-level XAI interface for gaze-based interaction using a deep learning model and evaluated it during a visual search task in XR. A between-subjects study revealed that participants who interacted with the XAI system made more accurate selections than those who did not (i.e., an F1 score increase of 10.8%) and adapted their gaze behavior over time to make more effective selections. These findings suggest that XAI can potentially be used to assist users in collaborating more effectively with model-driven interactions in XR.
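The abstract does not specify how the multi-level XAI interface generates its explanations. As an illustrative sketch only, the following shows one common XAI approach, feature attribution with Integrated Gradients via the Captum library, applied to a hypothetical gaze-intent classifier; GazeIntentNet, its input features, and all values are assumptions, not the paper's actual model or interface.

```python
# Illustrative sketch only: the paper's real model and XAI interface are not
# described in this abstract. GazeIntentNet and its feature set are hypothetical.
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

class GazeIntentNet(nn.Module):
    """Hypothetical classifier: does the current gaze window signal a selection?"""
    def __init__(self, n_features: int = 6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, 2),  # logits: [no-selection, selection]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = GazeIntentNet().eval()

# One window of hand-crafted gaze features (all values are made up), e.g.
# fixation duration, dispersion, saccade velocity, pupil delta, dwell ratio,
# and angular distance to the candidate target.
features = torch.tensor([[0.42, 0.10, 0.85, 0.03, 0.67, 0.12]])

# Integrated Gradients attributes the "selection" logit back to each input
# feature; a real-time interface could surface such per-feature scores to help
# users see which aspects of their gaze behavior drove the model's decision.
ig = IntegratedGradients(model)
attributions = ig.attribute(features, target=1)
print(attributions.detach().numpy())
```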