Single-Shot VR

SIGGRAPH '23: ACM SIGGRAPH 2023 Emerging Technologies (2023)

Abstract
The physical world contains content at varying depths, allowing the eye's lens to squish or relax to focus at different distances; this is commonly referred to as the accommodation cue for human eyes. To allow a realistic 3D viewing experience, it is crucial to support the accommodation cue: the 3D display needs to show content at different depths. However, supporting the native focusing of the eye has been an immense challenge for 3D displays. Commercial near-eye VR displays, which use binocular disparity as the primary cue for inducing depth perception, fail this challenge since all content they show arises from a fixed depth, ignoring the focusing of the eye. Many research prototypes of VR displays do account for the accommodation cue; however, supporting it invariably comes with a performance loss on other typically assessed criteria for 3D displays. To tackle these challenges, we present a novel kind of near-eye 3D display that can create 3D scenes supporting realistic accommodation cues in a single shot, i.e., without using time multiplexing or eye tracking. This display, which we present in our demo, can stream 3D content over a large depth range, at 4K spatial resolution, and in real time. Our display offers an exciting step towards a truly immersive real-time 3D experience. Participants will get to enjoy 3D movies and play interactive games during the demo.
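The abstract's core observation, that disparity-only displays place all imagery at one fixed focal depth, is the standard vergence-accommodation conflict. The following minimal sketch (not from the paper; the function names and the 2.0 m fixed focal distance are illustrative assumptions) quantifies that mismatch in diopters, which is why content rendered very near the viewer is the most uncomfortable on a fixed-focus headset.

```python
# Illustrative sketch of the vergence-accommodation conflict on a
# fixed-focus near-eye display. Not the paper's method; all constants
# and names here are assumptions for illustration only.

def diopters(distance_m: float) -> float:
    """Accommodation (or vergence) demand in diopters for a target at distance_m metres."""
    return 1.0 / distance_m

# Assumed fixed focal plane of a conventional commercial HMD.
FIXED_DISPLAY_FOCUS_M = 2.0

def vergence_accommodation_conflict(content_depth_m: float) -> float:
    """Mismatch (in diopters) between the depth where disparity asks the eyes
    to converge and the fixed depth where the display forces them to focus."""
    return abs(diopters(content_depth_m) - diopters(FIXED_DISPLAY_FOCUS_M))

if __name__ == "__main__":
    for depth in (0.3, 0.5, 1.0, 2.0, 10.0):
        conflict = vergence_accommodation_conflict(depth)
        print(f"content at {depth:>4} m -> conflict {conflict:.2f} D")
```

A display that supports the accommodation cue, as the demo claims to do in a single shot, drives this conflict towards zero by presenting each piece of content at (or near) its intended focal depth rather than at one fixed plane.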