Adversarial Training for Adverse Conditions: Robust Metric Localisation using Appearance Transfer

2018 IEEE International Conference on Robotics and Automation (ICRA)

Cited by 113 | Views 29
Abstract
We present a method of improving visual place recognition and metric localisation under very strong appearance change. We learn an invertible generator that can transform the conditions of images, e.g. from day to night or summer to winter. This image-transforming filter is explicitly designed to aid and abet feature matching using a new loss based on the SURF detector and dense descriptor maps. A network is trained to output synthetic images optimised for feature matching given only an input RGB image, and these generated images are used to localise the robot against a previously built map using traditional sparse matching approaches. We benchmark our results using multiple traversals of the Oxford RobotCar Dataset over a year-long period, using one traversal as a map and the other to localise. We show that this method significantly improves place recognition and localisation under changing and adverse conditions, while reducing the number of mapping runs needed to successfully achieve reliable localisation.
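The core idea is to train the appearance-transfer generator with a loss that compares feature representations of the generated image against those of the target condition, rather than only pixel values. Below is a minimal sketch of that idea, assuming a toy dense-descriptor CNN (`DescNet`) and an L1 penalty between descriptor maps; the network, names, and hyperparameters are illustrative assumptions and not the authors' SURF-based implementation.

```python
# Minimal sketch (not the authors' code): an L1 loss between dense descriptor
# maps of a generated image and a target-condition image, illustrating the
# idea of optimising an appearance-transfer generator for feature matching.
# DescNet and all hyperparameters below are assumptions for illustration.
import torch
import torch.nn as nn

class DescNet(nn.Module):
    """Toy dense-descriptor extractor standing in for a SURF-like descriptor map."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, dim, 3, padding=1),
        )

    def forward(self, x):
        d = self.net(x)
        # Unit-normalise the per-pixel descriptors before comparison.
        return d / (d.norm(dim=1, keepdim=True) + 1e-8)

def matching_loss(descriptor_net, generated, target):
    """L1 distance between dense descriptor maps of generated and target images."""
    with torch.no_grad():
        d_target = descriptor_net(target)   # descriptors of the reference (map) condition
    d_gen = descriptor_net(generated)       # descriptors of the synthesised image
    return nn.functional.l1_loss(d_gen, d_target)

# Usage with dummy tensors standing in for a night image translated to day
# and a day reference image from the map traversal:
desc = DescNet()
fake_day_image = torch.rand(1, 3, 128, 128, requires_grad=True)
day_reference = torch.rand(1, 3, 128, 128)
loss = matching_loss(desc, fake_day_image, day_reference)
loss.backward()
```

In a full pipeline this term would be added to the generator's adversarial objective, so the synthesised images remain realistic while also exposing descriptors that match well against the previously built map.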
Keywords
adversarial training, adverse conditions, robust metric localisation, appearance transfer, visual place recognition, invertible generator, image transforming filter, feature matching, dense descriptor maps, synthetic images, input RGB image, generated images, multiple traversals, reliable localisation