Joint Stabilization and Direction of 360° Videos

arXiv (2019)

Abstract
360° video provides an immersive experience for viewers, allowing them to freely explore the world by turning their head. However, creating high-quality 360° video content can be challenging, as viewers may miss important events by looking in the wrong direction, or they may see things that ruin the immersion, such as stitching artifacts and the film crew. We take advantage of the fact that not all directions are equally likely to be observed; most viewers are more likely to see content located at "true north", i.e. in front of them, due to ergonomic constraints. We therefore propose 360° video direction, where the video is jointly optimized to orient important events to the front of the viewer and visual clutter behind them, while producing smooth camera motion. Unlike traditional video, viewers can still explore the space as desired, but with the knowledge that the most important content is likely to be in front of them. Constraints can be user guided, either added directly on the equirectangular projection or by recording "guidance" viewing directions while watching the video in a VR headset, or automatically computed, such as via visual saliency or forward motion direction. To accomplish this, we propose a new motion estimation technique specifically designed for 360° video which outperforms the commonly used 5-point algorithm on wide-angle video. We additionally formulate the direction problem as an optimization where a novel parametrization of spherical warping allows us to correct for some degree of parallax effects. We compare our approach to recent methods that address stabilization-only and converting 360° video to narrow field-of-view video.
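The core "direction" operation described above amounts to rotating the spherical video so that the chosen important content sits at the viewer's front. For the yaw-only case this reduces to a horizontal circular shift of the equirectangular frame, since longitude maps linearly to the horizontal pixel axis. The sketch below illustrates that idea; it is not the paper's method (which jointly optimizes orientation and smoothness with a spherical-warping parametrization), and the function name `reorient_yaw` is our own illustrative choice.

```python
import numpy as np

def reorient_yaw(frame: np.ndarray, yaw_deg: float) -> np.ndarray:
    """Rotate an equirectangular frame about the vertical axis so that
    content at longitude `yaw_deg` (degrees, 0 = current front) is moved
    to the center of the frame, i.e. to the viewer's "true north".

    A yaw rotation of the sphere is exactly a circular shift of the
    equirectangular image along the horizontal (longitude) axis; pitch
    and roll would require a full per-pixel remap instead.
    """
    h, w = frame.shape[:2]
    # Longitude spans 360 degrees across w columns, so each degree of
    # yaw corresponds to w / 360 pixels of horizontal shift.
    shift = int(round(yaw_deg / 360.0 * w))
    return np.roll(frame, -shift, axis=1)
```

In a full pipeline the per-frame yaw would come from the optimization (saliency, user guidance, or forward motion) and be smoothed over time to avoid jarring reorientation; applying a raw per-frame shift like this would conflict with the paper's smooth-camera-motion objective.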