Deep Bayesian Future Fusion for Self-Supervised, High-Resolution, Off-Road Mapping
arXiv (2024)
Abstract
The limited sensing resolution of resource-constrained off-road vehicles
poses significant challenges to reliable off-road autonomy. To overcome
this limitation, we propose a general framework based on fusing future
information (i.e., future fusion) for self-supervision. Recent approaches
exploit this future information alongside hand-crafted heuristics to
directly supervise targeted downstream tasks (e.g., traversability
estimation). In this paper, however, we opt for a more general line of
development: time-efficient completion of the highest-resolution (i.e., 2 cm
per pixel) BEV map in a self-supervised manner via future fusion, which can
serve any downstream task with better longer-range prediction. To this end,
we first create a high-resolution future-fusion dataset containing pairs of
raw, sparse, and noisy (RGB / height) inputs and map-based dense labels.
Next, to accommodate the noise and sparsity of the sensory information,
especially in the distal regions, we design an efficient realization of the
Bayes filter on top of a vanilla convolutional network via a recurrent
mechanism. Equipped with ideas from state-of-the-art generative models, our
Bayesian structure effectively predicts high-quality BEV maps in the distal
regions. Extensive evaluation of both completion quality and a downstream
task on our future-fusion dataset demonstrates the potential of our approach.
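To make the recurrent Bayes-filter idea concrete, the sketch below shows a minimal, illustrative per-cell Gaussian fusion step on a BEV grid: each new sparse, noisy observation is folded into the running belief with a Kalman-style gain, so repeated (future) observations of the same cell shrink its variance. The function name `bayes_fuse` and all parameters are assumptions for illustration; the paper's actual realization embeds this recurrence inside a convolutional network rather than a closed-form per-cell filter.

```python
import numpy as np

def bayes_fuse(mean, var, obs, obs_var, mask):
    """One recurrent Bayes-filter update of a per-cell Gaussian BEV belief.

    mean, var : current belief over each grid cell, shape (H, W)
    obs       : new noisy measurement, shape (H, W); valid where mask is True
    obs_var   : scalar measurement-noise variance
    mask      : boolean (H, W); cells with no reading are left unchanged
    """
    gain = np.where(mask, var / (var + obs_var), 0.0)  # per-cell Kalman gain
    new_mean = mean + gain * (obs - mean)              # pull belief toward obs
    new_var = (1.0 - gain) * var                       # uncertainty shrinks
    return new_mean, new_var

# Fusing two noisy height scans of the same terrain patch: the belief
# converges toward the observations while the variance decreases.
H, W = 4, 4
mean = np.zeros((H, W))                 # prior mean (flat ground)
var = np.full((H, W), 1.0)              # broad prior uncertainty
mask = np.ones((H, W), dtype=bool)      # dense mask for this toy example
for scan in (np.full((H, W), 0.45), np.full((H, W), 0.55)):
    mean, var = bayes_fuse(mean, var, scan, obs_var=0.1, mask=mask)
```

Sequential fusion here is algebraically identical to a single batch Gaussian update (posterior precision is the sum of prior and observation precisions), which is what makes a recurrent realization attractive: frames can be folded in one at a time as they arrive.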