What does the person feel? Learning to infer applied forces during robot-assisted dressing.

ICRA (2017)

Abstract
During robot-assisted dressing, a robot manipulates a garment in contact with a person's body. Inferring the forces applied to the person's body by the garment might enable a robot to provide more effective assistance and give the robot insight into what the person feels. However, complex mechanics govern the relationship between the robot's end effector and these forces. Using a physics-based simulation and data-driven methods, we demonstrate the feasibility of inferring forces across a person's body using only end effector measurements. Specifically, we present a long short-term memory (LSTM) network that at each time step takes a 9-dimensional input vector of force, torque, and velocity measurements from the robot's end effector and outputs a force map consisting of hundreds of inferred force magnitudes across the person's body. We trained and evaluated LSTMs on two tasks: pulling a hospital gown onto an arm and pulling shorts onto a leg. For both tasks, the LSTMs produced force maps that were similar to ground truth when visualized as heat maps across the limbs. We also evaluated their performance in terms of root-mean-square error. Their performance degraded when the end effector velocity was increased outside the training range, but generalized well to limb rotations. Overall, our results suggest that robots could learn to infer the forces people feel during robot-assisted dressing, although the extent to which this will generalize to the real world remains an open question.
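The architecture the abstract describes can be sketched as a recurrent model that consumes one 9-D end-effector measurement per time step and emits a vector of per-location force magnitudes. Below is a minimal NumPy sketch under stated assumptions: the hidden size (64) and the number of output locations (200) are illustrative placeholders, not values reported in the paper, and the weights here are random rather than trained.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ForceMapLSTM:
    """Illustrative sketch: an LSTM that, at each time step, maps a 9-D
    end-effector measurement (3-D force, 3-D torque, 3-D velocity) to a
    force map of inferred magnitudes across the limb. Hidden size and the
    number of output locations are assumptions for illustration only."""

    def __init__(self, n_in=9, n_hidden=64, n_out=200, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(n_hidden)
        # One stacked weight matrix covering the input, forget, cell,
        # and output gates, applied to [x_t; h_{t-1}].
        self.W = rng.uniform(-s, s, (4 * n_hidden, n_in + n_hidden))
        self.b = np.zeros(4 * n_hidden)
        # Linear readout from the hidden state to the force map.
        self.Wy = rng.uniform(-s, s, (n_out, n_hidden))
        self.by = np.zeros(n_out)
        self.n_hidden = n_hidden

    def forward(self, seq):
        """seq: (T, 9) array of measurements -> (T, n_out) force maps."""
        H = self.n_hidden
        h = np.zeros(H)
        c = np.zeros(H)
        maps = []
        for x in seq:
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f, g, o = z[:H], z[H:2*H], z[2*H:3*H], z[3*H:]
            c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # cell update
            h = sigmoid(o) * np.tanh(c)                   # hidden state
            maps.append(self.Wy @ h + self.by)
        return np.stack(maps)

model = ForceMapLSTM()
seq = np.random.default_rng(1).normal(size=(50, 9))  # 50-step trajectory
maps = model.forward(seq)
print(maps.shape)  # (50, 200)
```

In the paper's setting, a model of this shape would be trained with a regression loss (e.g. mean squared error) against simulated ground-truth force maps; the root-mean-square error evaluation mentioned in the abstract corresponds to comparing these per-location outputs to ground truth.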
Keywords
applied forces, robot-assisted dressing, garment, complex mechanics, physics-based simulation, data-driven methods, long short-term memory network, LSTM, 9-dimensional input vector, velocity measurements, force map, inferred force magnitudes, hospital gown, heat maps, root-mean-square error, end effector velocity, training range