Joint Action Perception To Enable Fluent Human-Robot Teamwork

24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2015

Abstract
To be effective team members, it is important for robots to understand the high-level behaviors of collocated humans. This is a challenging perceptual task when both the robots and people are in motion. In this paper, we describe an event-based model for multiple robots to automatically measure synchronous joint action of a group while both the robots and co-present humans are moving. We validated our model through an experiment where two people marched both synchronously and asynchronously, while being followed by two mobile robots. Our results suggest that our model accurately identifies synchronous motion, which can enable more adept human-robot collaboration.
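The paper's event-based model is detailed in the full text; as a rough illustration of the underlying idea, the sketch below scores synchrony between two streams of detected motion events (e.g., step onsets for each tracked person) by checking how many events in one stream have a counterpart in the other within a small time tolerance. The function name, the event representation, and the tolerance value are assumptions for illustration, not the authors' method.

```python
import numpy as np


def synchrony_score(events_a, events_b, tolerance=0.15):
    """Illustrative (not the paper's) synchrony measure.

    events_a, events_b: sorted 1-D arrays of event timestamps in seconds,
    e.g. footstep onsets detected for two tracked people.
    Returns the average fraction of events in each stream that have a
    matching event in the other stream within `tolerance` seconds.
    """
    def matched_fraction(src, ref):
        if len(src) == 0 or len(ref) == 0:
            return 0.0
        # For each source event, find the distance to the nearest reference event.
        idx = np.searchsorted(ref, src)
        idx = np.clip(idx, 1, len(ref) - 1)
        nearest = np.minimum(np.abs(src - ref[idx - 1]), np.abs(src - ref[idx]))
        return float(np.mean(nearest <= tolerance))

    a, b = np.asarray(events_a, float), np.asarray(events_b, float)
    return 0.5 * (matched_fraction(a, b) + matched_fraction(b, a))


# Example: near-synchronous marching vs. asynchronous marching.
steps_a = [0.0, 0.5, 1.0, 1.5]
steps_b_sync = [0.05, 0.52, 1.03, 1.48]
steps_b_async = [0.25, 0.80, 1.30, 1.70]
print(synchrony_score(steps_a, steps_b_sync))   # close to 1.0
print(synchrony_score(steps_a, steps_b_async))  # close to 0.0
```

In this toy formulation, the tolerance window plays the role of deciding when two motion events count as jointly timed; the paper's model additionally has to handle the fact that both the observers (the robots) and the observed people are moving.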
Keywords
effective team members,high-level behaviors,collocated humans,event-based model,automatic synchronous joint action measurement,mobile robots,synchronous motion,asynchronous motion,human-robot collaboration,fluent human-robot teamwork