Abstract
Cinematic virtual reality (VR) aims to provide immersive visual experiences of real-world scenes on head-mounted displays. Current cinematic VR systems employ omnidirectional stereo video captured from a fixed position and therefore cannot provide head-motion parallax, an important cue for depth perception. We propose a new 3D video representation, referred to as depth augmented stereo panorama (DASP), to address this limitation. DASP is designed with the data capture, post-production, streaming, and rendering stages of the VR pipeline in mind. The capabilities of the representation are evaluated by comparing its generated viewports with those rendered from known 3D models. Results indicate that DASP successfully creates stereo and induces head-motion parallax within a predefined operating range.