Interacting with a Social Robot Affects Visual Perception of Space.

HRI 2020

Abstract
Human partners are very effective at coordinating in space and time. Such ability is particularly remarkable considering that visual perception of space is a complex inferential process, which is affected by individual prior experience (e.g., the history of previous stimuli). As a result, two partners might perceive the same stimulus differently. Yet they find a way to align their perceptions, as demonstrated by the high degree of coordination observed in sports or even in everyday gestures such as shaking hands. Robots would need a similar ability to align with their partner's perception. However, to date little is known about how the inferential mechanism supporting visual perception operates during social interaction. In the current work, we use a humanoid robot to address this question. We replicate a standard protocol for the quantification of perceptual inference in an HRI setting. Participants estimated the length of a set of segments presented by the humanoid robot iCub. In one condition the robot behaved as a mechanical arm driven by a computer, and in another condition as an interactive, social partner. Even though the stimuli presented were the same in the two conditions, length perception differed when the robot was judged as an interactive agent rather than a mechanical tool: when playing with the social robot, participants relied significantly less on stimulus history. This result suggests that the brain changes its optimization strategies during interaction, and it lays the foundations for designing human-aware robot visual perception.
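In protocols of this kind, reliance on stimulus history is typically quantified as the "central tendency" of responses: estimates regress toward the mean of previously presented stimuli. The sketch below is a minimal illustration, not the authors' analysis code; the history weights (`w_prior = 0.3` vs `0.1`), the stimulus range, and the noise level are all illustrative assumptions, chosen only to mirror the direction of the reported effect.

```python
# Illustrative sketch: an observer combines a noisy measurement of the
# current segment with a prior centered on the running mean of past
# stimuli. Reliance on stimulus history appears as a regression of
# estimates toward that mean ("central tendency").
import numpy as np

rng = np.random.default_rng(0)

def simulate_session(n_trials=200, sigma_sensory=0.8, w_prior=0.3):
    """Simulate length estimates; w_prior is the (hypothetical) weight
    given to stimulus history (0 = ignore history, 1 = history only)."""
    lengths = rng.uniform(4.0, 12.0, n_trials)   # presented segment lengths (cm)
    estimates = np.empty(n_trials)
    running_mean = lengths[0]                    # prior starts at the first stimulus
    for t, length in enumerate(lengths):
        measurement = length + rng.normal(0.0, sigma_sensory)
        # weighted average of current evidence and stimulus history
        estimates[t] = (1.0 - w_prior) * measurement + w_prior * running_mean
        running_mean += (length - running_mean) / (t + 1)  # update history
    return lengths, estimates

def central_tendency_index(lengths, estimates):
    """1 minus the slope of estimates vs. true lengths; values above 0
    indicate regression toward the mean, i.e. reliance on history."""
    slope = np.polyfit(lengths, estimates, 1)[0]
    return 1.0 - slope

# Illustrative condition weights only (not the paper's fitted values):
for label, w in [("mechanical-arm condition", 0.3), ("social condition", 0.1)]:
    lengths, estimates = simulate_session(w_prior=w)
    print(f"{label}: central tendency index = "
          f"{central_tendency_index(lengths, estimates):.2f}")
```

Under this reading, the paper's result corresponds to a smaller central-tendency index in the social condition: estimates track the current stimulus more closely and the running mean of past stimuli less.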
Keywords
Social computing, Robot kinematics, Humanoid robots, Human-robot interaction, Tools, History, Robots