A Framework for Recognizing Industrial Actions via Joint Angles

2022 IEEE-RAS 21st International Conference on Humanoid Robots (Humanoids)

Abstract
This paper proposes a novel framework for recognizing industrial actions from the perspective of human-robot collaboration. Given a one-second-long measurement of the human's motion, the framework can determine his/her action. The originality lies in the use of joint angles instead of Cartesian coordinates. This design choice makes the framework sensor-agnostic and invariant to affine transformations and to anthropometric differences. On the AnDy dataset, we outperform the state-of-the-art classifier. Furthermore, we show that our framework is effective with limited training data, that it is subject-independent, and that it is compatible with robotic real-time constraints. In terms of methodology, the framework is an original synergy of two antithetical schools of thought: model-based and data-based algorithms. Indeed, it is the cascade of an inverse kinematics estimator compliant with the International Society of Biomechanics recommendations, followed by a deep learning architecture based on Bidirectional Long Short-Term Memory. We believe our work may pave the way to successful and fast action recognition with standard depth cameras embedded on moving collaborative robots.
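The abstract describes the data-based half of the cascade as a Bidirectional LSTM that maps a one-second window of joint angles to an action class. The paper does not give the architecture's details, so the following is only a minimal sketch of that idea: a single-layer bidirectional LSTM whose two final hidden states feed a softmax classifier. All sizes (66 joint angles per frame, 30 frames, 8 action classes, 32 hidden units) and all parameter names are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_pass(x, W, U, b, reverse=False):
    """Run a single-direction LSTM over a (T, D) sequence; return the final hidden state."""
    T, _ = x.shape
    H = U.shape[1]
    h = np.zeros(H)
    c = np.zeros(H)
    steps = range(T - 1, -1, -1) if reverse else range(T)
    for t in steps:
        z = W @ x[t] + U @ h + b              # stacked gate pre-activations, shape (4H,)
        i, f, o, g = np.split(z, 4)           # input, forget, output gates; cell candidate
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)
        h = o * np.tanh(c)
    return h

def classify_window(x, params):
    """Bidirectional LSTM over a joint-angle window -> softmax over action classes."""
    h_fwd = lstm_pass(x, *params["fwd"], reverse=False)
    h_bwd = lstm_pass(x, *params["bwd"], reverse=True)
    h = np.concatenate([h_fwd, h_bwd])        # (2H,) joint representation of the window
    logits = params["Wout"] @ h + params["bout"]
    e = np.exp(logits - logits.max())         # numerically stable softmax
    return e / e.sum()

# Hypothetical sizes: 66 joint angles per frame, 30 frames (~1 s), 8 action classes.
D, H, T, C = 66, 32, 30, 8

def init_dir():
    # Random, untrained weights for one LSTM direction (illustration only).
    return (0.1 * rng.standard_normal((4 * H, D)),
            0.1 * rng.standard_normal((4 * H, H)),
            np.zeros(4 * H))

params = {"fwd": init_dir(), "bwd": init_dir(),
          "Wout": 0.1 * rng.standard_normal((C, 2 * H)),
          "bout": np.zeros(C)}

window = rng.standard_normal((T, D))          # stands in for one second of joint angles
probs = classify_window(window, params)       # class probabilities, shape (C,)
```

With untrained weights the output is of course arbitrary; the sketch only shows the data flow (angle window in, class distribution out) that the sensor-agnostic design implies: any device whose output can be converted to joint angles by the inverse kinematics stage can feed the same classifier.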
Keywords
industrial actions, joint angles