Where am I walking? - MultiNet based Proprioceptive Terrain Classification for Legged Robots

2023 20th International Conference on Ubiquitous Robots (UR)

Abstract
Autonomous exploration of unknown and rough terrain is a challenging task for mobile robots. To better understand its surroundings, a robot needs to perceive the ground. In most scenarios, this is done via visual perception. However, there are circumstances, such as fog or dust, in which visual feedback is unreliable. Additionally, grounds can feel and react differently even when they look similar. Therefore, we propose an approach that utilizes the proprioceptive data of a walking robot to classify the ground and estimate its properties. For our approach, we created a dataset covering seven different terrains. A small Long Short-Term Memory (LSTM) neural network was trained on the data and adapted in several experiments. With different preprocessing steps, a ground-classification accuracy of up to 95.2% was reached while walking. When the robot stomps in place, an accuracy of 98% was obtained. This approach provides a reliable additional modality for ground perception in challenging environments.
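The core idea, a small LSTM that consumes a window of proprioceptive signals and outputs one of seven terrain classes, can be sketched as follows. This is a minimal illustrative forward pass in plain numpy; the feature count, hidden size, and window length are assumptions, as the abstract does not state the exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- illustrative assumptions, not the paper's values.
N_FEATURES = 12   # e.g. joint positions/torques of the legged robot (assumed)
HIDDEN = 32       # "small" LSTM hidden size (assumed)
N_CLASSES = 7     # seven terrain types, as stated in the abstract
SEQ_LEN = 50      # length of one proprioceptive window (assumed)

# One fused weight matrix for the four LSTM gates (input, forget, cell, output).
W = rng.normal(0, 0.1, (4 * HIDDEN, N_FEATURES + HIDDEN))
b = np.zeros(4 * HIDDEN)
W_out = rng.normal(0, 0.1, (N_CLASSES, HIDDEN))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_classify(x):
    """Run a single-layer LSTM over a (SEQ_LEN, N_FEATURES) window and
    return terrain-class probabilities from the final hidden state."""
    h = np.zeros(HIDDEN)
    c = np.zeros(HIDDEN)
    for t in range(x.shape[0]):
        z = W @ np.concatenate([x[t], h]) + b
        i, f, g, o = np.split(z, 4)          # gate pre-activations
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
    logits = W_out @ h
    p = np.exp(logits - logits.max())        # softmax over 7 classes
    return p / p.sum()

window = rng.normal(size=(SEQ_LEN, N_FEATURES))  # stand-in sensor window
probs = lstm_classify(window)
```

In a trained system, the weights would be learned from the labeled terrain dataset and the window would come from the robot's real sensor stream; here they are random, so the output distribution is only a shape/plumbing check.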
Keywords
autonomous exploration, ground perception, legged robots, long short-term memory neural network, mobile robots, preprocessing steps, proprioceptive terrain classification, visual feedback, visual perception, walking robot