VIR-SLAM: visual, inertial, and ranging SLAM for single and multi-robot systems

Autonomous Robots (2021)

Abstract
Monocular cameras coupled with inertial measurements generally give high-performance visual-inertial odometry. However, drift can be significant over long trajectories, especially when the environment is visually challenging. In this paper, we propose a system that leverages Ultra-WideBand (UWB) ranging with one static anchor placed in the environment to correct the accumulated error whenever the anchor is visible. We also use this setup for collaborative SLAM: different robots use mutual ranging (when available) and the common anchor to estimate the transformation between each other, facilitating map fusion. Our system consists of two modules: a double-layer ranging, visual, and inertial odometry module for single robots, and a transformation estimation module for collaborative SLAM. We test our system on public datasets by simulating UWB measurements, as well as on real robots in different environments. Experiments validate our system and show that our method can outperform pure visual-inertial odometry by more than 20%; in visually challenging environments, our method works even when the visual-inertial pipeline has significant drift. Furthermore, we can compute the inter-robot transformation matrices for collaborative SLAM at almost no extra computational cost.
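To illustrate the anchor-ranging idea from the abstract, the sketch below is a minimal, hypothetical example (plain Python with NumPy/SciPy, not the paper's code): it fits a translational drift correction to a window of VIO positions using only range measurements to a single static anchor at a known position. The function and variable names, and the simplification of drift to a pure translation, are assumptions for illustration; the actual system tightly couples ranging residuals inside a double-layer visual-inertial optimization rather than applying a post-hoc fit.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical setup: p_i are 3D positions from a drifting VIO pipeline,
# r_i are UWB range measurements to a static anchor a with known position.
# We seek a translation t such that || (p_i + t) - a || ~= r_i for all i.
# (The paper's estimator couples such range residuals into the VIO
# optimization itself; this standalone fit is only a sketch of the idea.)

def range_residuals(t, positions, anchor, ranges):
    corrected = positions + t                        # candidate corrected poses
    dists = np.linalg.norm(corrected - anchor, axis=1)
    return dists - ranges                            # one residual per UWB range

rng = np.random.default_rng(0)
anchor = np.array([5.0, 2.0, 1.0])
true_positions = rng.uniform(-3.0, 3.0, size=(20, 3))        # ground truth
ranges = np.linalg.norm(true_positions - anchor, axis=1)     # ideal UWB ranges
drift = np.array([0.4, -0.3, 0.1])                           # simulated VIO drift
vio_positions = true_positions + drift                       # drifting estimates

sol = least_squares(range_residuals, x0=np.zeros(3),
                    args=(vio_positions, anchor, ranges))
print("recovered drift:", -sol.x)   # should be close to the simulated drift
```

The same kind of range residual, written between two robots' trajectories and the common anchor, is what makes the inter-robot transformation observable in the collaborative case, which is why the paper can estimate it at almost no extra computational cost.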
Keywords
visual, inertial, VIR-SLAM, multi-robot