Depth From Texture Integration

2019 IEEE International Conference on Computational Photography (ICCP)

Abstract
We present a new approach for active ranging, which can be combined with traditional methods such as active depth from defocus or off-axis structured illumination. The object is illuminated by an active textured pattern having high spatial-frequency content. The illumination texture varies in time while the object undergoes a focal sweep. Consequently, in a single exposure, the illumination textures are encoded as a function of object depth. At each object depth, a particular illumination texture, with its high spatial-frequency content, is in focus; the other textures, projected while the system is defocused, are blurred. Analysis of the time-integrated image decodes the depth map. The plurality of projected and sensed color channels enhances performance, as we demonstrate experimentally. Using a wide aperture and only one or two readout frames, the method is particularly useful for imaging that requires high sensitivity to weak signals and high spatial resolution. Because the focal sweep takes place within the exposure, the imaging covers a wide depth range while remaining fast.
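
The abstract does not detail the decoding algorithm, but the core idea (a given texture is sharp only at the depth where it was projected in focus) can be illustrated with a small sketch. The snippet below is a hypothetical toy reconstruction, not the authors' implementation: it assumes each texture k is a sinusoidal carrier at a known spatial frequency, projected while the sweep is focused at a known depth, and it decodes depth per pixel by selecting the carrier with the strongest local band-pass energy in the time-integrated image. The helper names (bandpass_energy, decode_depth) and all numerical parameters are illustrative assumptions.

```python
# Toy sketch (assumed model, not the paper's code): texture k is a sinusoidal
# carrier at spatial frequency freqs[k], projected while the focal sweep is
# focused at depths[k]. In the time-integrated image, only the carrier that
# was in focus at a pixel's true depth keeps high contrast; the rest blur out.
import numpy as np
from scipy.ndimage import gaussian_filter

def bandpass_energy(image, freq, sigma=4.0):
    """Local energy of `image` around spatial frequency `freq` (cycles/pixel),
    via quadrature demodulation followed by a Gaussian low-pass."""
    w = image.shape[1]
    x = np.arange(w)[None, :]
    i_c = gaussian_filter(image * np.cos(2 * np.pi * freq * x), sigma)
    i_s = gaussian_filter(image * np.sin(2 * np.pi * freq * x), sigma)
    return np.sqrt(i_c**2 + i_s**2)

def decode_depth(integrated_image, carrier_freqs, focus_depths, sigma=4.0):
    """Per-pixel depth from a single time-integrated image: pick the texture
    whose carrier has the largest local band-pass energy."""
    energies = np.stack([bandpass_energy(integrated_image, f, sigma)
                         for f in carrier_freqs], axis=0)
    best = np.argmax(energies, axis=0)      # index of the sharpest texture
    return np.asarray(focus_depths)[best]   # map texture index -> focus depth

if __name__ == "__main__":
    # Synthetic scene: left half at 0.5 m (texture 0 sharp), right half at
    # 1.5 m (texture 1 sharp). Defocus is emulated by damping carrier contrast.
    rng = np.random.default_rng(0)
    h, w = 64, 128
    freqs, depths = [0.10, 0.25], [0.5, 1.5]
    x = np.arange(w)[None, :]
    img = np.ones((h, w))
    img[:, : w // 2] += (0.5 * np.cos(2 * np.pi * freqs[0] * x[:, : w // 2])
                         + 0.05 * np.cos(2 * np.pi * freqs[1] * x[:, : w // 2]))
    img[:, w // 2:] += (0.05 * np.cos(2 * np.pi * freqs[0] * x[:, w // 2:])
                        + 0.5 * np.cos(2 * np.pi * freqs[1] * x[:, w // 2:]))
    img += 0.01 * rng.standard_normal((h, w))
    depth_map = decode_depth(img, freqs, depths)
    print(depth_map[32, 10], depth_map[32, 100])  # expect ~0.5 and ~1.5
```

The paper additionally exploits multiple projected and sensed color channels and an electrically tunable lens for the sweep; the sketch above only illustrates the single-channel, single-frame decoding principle.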
Keywords
computational imaging,focal sweep,three-dimensional shape recovery,active illumination,electrically tunable lens