Analysis and Recognition of Confidence Level Based on Eye Gaze and Head Movement Towards Human-Robot Co-Learning

2023 International Conference on Machine Learning and Cybernetics (ICMLC)

Abstract
Understanding human internal conditions such as emotion, concentration, and confidence level remains a challenging topic in human-robot interaction. In this research, we focus on estimating confidence levels from physiological sensor data, namely eye gaze and head movement, aiming to support self-study with robots that give learners hints. Previous research recognized confidence levels only after all data had been collected, i.e., "offline recognition". To extend the method from offline to real-time recognition, this paper compares participants' physiological data before and after they hear a hint. In addition, we classify their confidence levels with machine learning. The physiological data show that the distribution of gaze points tends to be more clustered when participants arrive at an answer than while they are still considering one. The machine learning classifiers reached around 60% accuracy in estimating confidence levels. These results suggest that future work should address individual differences, refine the experimental design, and improve data quality.
Keywords
Human-robot co-learning,Physiological data,Human internal condition,Confidence
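The abstract does not give implementation details, but the kind of pipeline it describes (summarizing gaze-point dispersion per trial and feeding it to a classifier of confidence level) can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the authors' code: the feature set (per-axis standard deviation and mean distance to the gaze centroid), the RandomForestClassifier, and the synthetic trial data are all placeholders chosen for the example.

```python
# Hypothetical sketch (not the authors' pipeline): classifying confidence
# from gaze dispersion, assuming each trial gives (x, y) gaze samples and
# a binary confidence label.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def gaze_features(gaze_xy: np.ndarray) -> np.ndarray:
    """Summarize one trial's gaze points (N x 2 array) as dispersion features."""
    std_x, std_y = gaze_xy.std(axis=0)                 # spread along each axis
    centroid = gaze_xy.mean(axis=0)
    mean_dist = np.linalg.norm(gaze_xy - centroid, axis=1).mean()  # mean distance to centroid
    return np.array([std_x, std_y, mean_dist])

# Synthetic toy trials: smaller spread ~ more clustered gaze, labeled "confident" (1).
rng = np.random.default_rng(0)
trials = []
for _ in range(60):
    spread = rng.uniform(0.5, 2.0)
    gaze_xy = rng.normal(size=(200, 2)) * spread
    trials.append((gaze_xy, int(spread < 1.25)))

X = np.vstack([gaze_features(g) for g, _ in trials])
y = np.array([label for _, label in trials])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)              # cross-validated accuracy
print(f"Mean accuracy: {scores.mean():.2f}")
```

With real eye-tracking data, the same dispersion features would be computed per answering phase (considering vs. coming up with an answer), which is where the paper reports the clustering difference.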