Progressive Depth Decoupling and Modulating for Flexible Depth Completion
arXiv (2024)
Abstract
Image-guided depth completion aims at generating a dense depth map from
sparse LiDAR data and an RGB image. Recent methods have shown promising
performance by reformulating it as a classification problem with two sub-tasks:
depth discretization and probability prediction. They divide the depth range
into several discrete depth values as depth categories, serving as priors for
scene depth distributions. However, previous depth discretization methods are
easily affected by depth distribution variations across different scenes,
resulting in suboptimal scene depth distribution priors. To address this
problem, we propose a progressive depth decoupling and modulating network,
which incrementally decouples the depth range into bins and adaptively
generates multi-scale dense depth maps in multiple stages. Specifically, we
first design a Bins Initializing Module (BIM) to construct the seed bins by
exploring the depth distribution information within a sparse depth map,
adapting to variations in depth distribution. Then, we devise an incremental depth
decoupling branch to progressively refine the depth distribution information
from global to local. Meanwhile, an adaptive depth modulating branch is
developed to progressively improve the probability representation from
coarse-grained to fine-grained. In addition, bi-directional information
interactions are introduced between the two branches (sub-tasks) to
promote information complementation in each branch.
Further, we introduce a multi-scale supervision mechanism to learn the depth
distribution information in latent features and enhance the adaptation
capability across different scenes. Experimental results on public datasets
demonstrate that our method outperforms the state-of-the-art methods. The code
will be open-sourced at [this https URL](https://github.com/Cisse-away/PDDM).
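The abstract's classification-based formulation can be illustrated with a minimal sketch: depth is discretized into bins whose centers follow the sparse-depth distribution (the role the paper assigns to its Bins Initializing Module), and the final depth is the probability-weighted expectation over bin centers. The function names and the quantile-based binning below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def init_seed_bins(sparse_depth, num_bins=64):
    """Hypothetical seed-bin construction: place bin centers at the
    quantiles of the valid sparse-depth values, so the bins adapt to
    the scene's observed depth distribution."""
    valid = sparse_depth[sparse_depth > 0]  # keep LiDAR hits only
    quantiles = np.linspace(0.0, 1.0, num_bins)
    return np.quantile(valid, quantiles)  # shape (num_bins,)

def depth_from_probs(probs, bin_centers):
    """Soft classification -> depth: per-pixel expected value over bins.
    probs: (H, W, num_bins) softmax probabilities summing to 1 per pixel."""
    return probs @ bin_centers  # shape (H, W)
```

With uniform probabilities every pixel decodes to the mean of the bin centers; a trained network would instead concentrate probability mass near the true depth, and the paper refines both the bins and the probabilities progressively across stages.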