Deep Learning For Tactile Understanding From Visual And Haptic Data

2016 IEEE International Conference on Robotics and Automation (ICRA), 2016

Cited by 278 | Views 160
Abstract
Robots that interact with the physical world will benefit from a fine-grained tactile understanding of objects and surfaces. Additionally, for certain tasks, robots may need to know the haptic properties of an object before touching it. To enable better tactile understanding for robots, we propose a method of classifying surfaces with haptic adjectives (e.g., compressible or smooth) from both visual and physical interaction data. Humans typically combine visual predictions and feedback from physical interactions to accurately predict haptic properties and interact with the world. Inspired by this cognitive pattern, we propose and explore a purely visual haptic prediction model. Purely visual models enable a robot to "feel" without physical interaction. Furthermore, we demonstrate that using both visual and physical interaction signals together yields more accurate haptic classification. Our models take advantage of recent advances in deep neural networks by employing a unified approach to learning features for physical interaction and visual observations. Even though we employ little domain-specific knowledge, our model still achieves better results than methods based on hand-designed features.
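The abstract describes a multimodal deep network that learns features from visual observations and physical interaction signals and fuses them for multi-label haptic-adjective classification. The sketch below illustrates that general idea; it is not the authors' actual architecture, and all layer sizes, signal dimensions, and the adjective count are assumptions chosen for illustration.

```python
# Illustrative sketch (not the paper's exact model): two learned feature
# branches, one visual and one haptic, fused for multi-label prediction
# of haptic adjectives. All dimensions below are assumed for illustration.
import torch
import torch.nn as nn

class VisuoHapticNet(nn.Module):
    def __init__(self, num_adjectives=24, haptic_channels=4, haptic_len=150):
        super().__init__()
        # Visual branch: a small CNN over a 3x64x64 surface image patch.
        self.visual = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 128), nn.ReLU(),
        )
        # Haptic branch: 1-D convolutions over multi-channel interaction
        # signals (e.g., pressure/vibration traces from squeezing or sliding).
        self.haptic = nn.Sequential(
            nn.Conv1d(haptic_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),
            nn.Flatten(),
            nn.Linear(32 * 8, 128), nn.ReLU(),
        )
        # Fusion head: concatenated features -> one logit per adjective
        # (adjectives are not mutually exclusive, so this is multi-label).
        self.classifier = nn.Linear(128 + 128, num_adjectives)

    def forward(self, image, haptic_signal):
        v = self.visual(image)          # (B, 128)
        h = self.haptic(haptic_signal)  # (B, 128)
        return self.classifier(torch.cat([v, h], dim=1))  # (B, num_adjectives) logits

# Usage: multi-label training with a per-adjective sigmoid cross-entropy loss.
model = VisuoHapticNet()
image = torch.randn(8, 3, 64, 64)            # batch of surface images
haptic = torch.randn(8, 4, 150)              # batch of haptic time series
labels = torch.randint(0, 2, (8, 24)).float()  # binary adjective labels
loss = nn.BCEWithLogitsLoss()(model(image, haptic), labels)
loss.backward()
```

A purely visual variant of this sketch would simply drop the haptic branch and classify from the visual features alone, matching the abstract's point that vision-only prediction lets a robot "feel" before touching.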
Keywords
tactile understanding,deep learning,haptic data,visual data,fine grained tactile,haptic adjectives,physical interaction data,visual interaction data,visual predictions,physical interactions,cognitive pattern,physical interaction signals,haptic classification,deep neural networks,physical interaction,visual observations