Accelerating Neural Field Training via Soft Mining
CVPR 2024 (2023)
Abstract
We present an approach to accelerate Neural Field training by efficiently
selecting sampling locations. While Neural Fields have recently become popular,
they are often trained by uniformly sampling the training domain, or through
handcrafted heuristics. We show that improved convergence and final training
quality can be achieved by a soft mining technique based on importance
sampling: rather than considering or ignoring a pixel completely, we weight
the corresponding loss by a scalar. To implement our idea we use Langevin
Monte-Carlo sampling. We show that, by doing so, regions with higher error are
selected more frequently, leading to a more than 2x improvement in convergence
speed. The code and related resources for this study are publicly available at
https://ubc-vision.github.io/nf-soft-mining/.
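The core idea can be sketched on a 1-D toy problem. The snippet below is a minimal illustration (not the authors' code): sample locations are updated with Langevin Monte-Carlo steps so they drift toward a hypothetical high-error region while the noise term keeps the whole domain covered, and each sample's loss is then weighted by an importance weight rather than samples being hard-selected or discarded. The error field, step size, and 1/q weighting are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-location "error" field on [0, 1], standing in for the
# per-pixel training loss: high error concentrated near x = 0.7.
def error(x):
    return 1e-3 + np.exp(-((x - 0.7) ** 2) / (2 * 0.05 ** 2))

def grad_log_error(x, eps=1e-4):
    # Finite-difference score of the error field, d/dx log error(x).
    return (np.log(error(x + eps)) - np.log(error(x - eps))) / (2 * eps)

def langevin_step(x, step=2e-4):
    # One Langevin Monte-Carlo step: drift up the log-error gradient,
    # plus Gaussian noise so low-error regions are never fully ignored.
    noise = rng.standard_normal(x.shape)
    x = x + 0.5 * step * grad_log_error(x) + np.sqrt(step) * noise
    return np.clip(x, 0.0, 1.0)  # keep samples inside the training domain

n = 2048
x = rng.uniform(0.0, 1.0, n)  # start from uniform sampling
for _ in range(2000):
    x = langevin_step(x)

# Soft mining: instead of dropping low-error samples, weight each loss term
# by an importance weight ~ 1 / q(x); here the sampling density q is
# approximated by the (unnormalized) error field itself.
weights = 1.0 / error(x)
```

After the Langevin iterations, the sample population is concentrated around the high-error region, so those locations contribute to the training loss more frequently, while the importance weights compensate for the non-uniform sampling density.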