Evetac: An Event-based Optical Tactile Sensor for Robotic Manipulation
IEEE Transactions on Robotics (2023)
Abstract
Optical tactile sensors have recently become popular. They provide high
spatial resolution, but struggle to offer fine temporal resolution. To
overcome this shortcoming, we study the idea of replacing the RGB camera with
an event-based camera and introduce a new event-based optical tactile sensor
called Evetac. Along with hardware design, we develop touch processing
algorithms to process its measurements online at 1000 Hz. We devise an
efficient algorithm to track the elastomer's deformation through the imprinted
markers despite the sensor's sparse output. Benchmarking experiments
demonstrate Evetac's capabilities of sensing vibrations up to 498 Hz,
reconstructing shear forces, and significantly reducing data rates compared to
RGB optical tactile sensors. Moreover, Evetac's output and the marker tracking
provide meaningful features for learning data-driven slip detection and
prediction models. The learned models form the basis for a robust and adaptive
closed-loop grasp controller capable of handling a wide range of objects. We
believe that fast and efficient event-based tactile sensors like Evetac will be
essential for bringing human-like manipulation capabilities to robotics. The
sensor design is open-sourced at https://sites.google.com/view/evetac .
Keywords
Force and Tactile Sensing, Perception for Grasping and Manipulation, Deep Learning in Robotics and Automation, Event-based Sensing