Casual 6-DoF: free-viewpoint panorama
using a handheld 360° camera

IEEE Transactions on Visualization and Computer Graphics (TVCG) 2022


Rongsen Chen, Fanglue Zhang, Simon Finnie, Andrew Chalmers, Taehyun Rhee

Computational Media Innovation Centre, Victoria University of Wellington   

{rongsen.chen, fanglue.zhang, simon.finnie, andrew.chalmers, taehyun.rhee}@vuw.ac.nz

Abstract



Six degrees-of-freedom (6-DoF) video provides telepresence by enabling users to move around in the captured scene with a wide field of regard. Compared to methods requiring sophisticated camera setups, image-based rendering built on photogrammetry can work with images captured at arbitrary poses, making it more suitable for casual users. However, existing image-based rendering methods rely on perspective images; when used to reconstruct 6-DoF views, they often require capturing hundreds of images, making data capture a tedious and time-consuming process. In contrast to traditional perspective images, a 360° image captures the entire surrounding view in a single shot, allowing a much faster capture process for 6-DoF view reconstruction. This paper presents a novel method to provide 6-DoF experiences over a wide area using an unstructured collection of 360° panoramas captured by a conventional 360° camera. Our method consists of 360° data capture, novel depth estimation that produces high-quality spherical depth panoramas, and high-fidelity free-viewpoint generation. We compared our method against state-of-the-art methods using data captured in various environments; our method shows better visual quality and robustness in the tested scenes.


Overview



Our method provides a real-time 6-DoF viewing experience using 360° panoramic images captured by a handheld 360° camera. Given an unstructured collection of monocular 360° panoramas, our method starts with an offline process that recovers the orientation and position of each input panorama. We then recover sparse and dense depth panoramas of the scene, and apply an iterative refinement process to improve the quality of the estimated depth. Finally, we use depth-based image rendering to synthesize 360° RGB images from the input panoramas and their recovered depth. Our novel panoramic view synthesis method renders panoramic images from novel viewpoints at 30 fps.
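The depth-based reprojection at the heart of this kind of pipeline can be sketched as follows. This is a minimal NumPy illustration of forward-warping an equirectangular panorama with per-pixel depth to a translated viewpoint, not the paper's implementation; the function names and the simple warp formulation are our own assumptions:

```python
import numpy as np

def equirect_to_dirs(h, w):
    """Unit view directions for each pixel of an h-by-w equirectangular panorama."""
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi   # longitude in [-pi, pi)
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi   # latitude in (pi/2, -pi/2)
    lon, lat = np.meshgrid(lon, lat)
    return np.stack([np.cos(lat) * np.sin(lon),          # x
                     np.sin(lat),                        # y (up)
                     np.cos(lat) * np.cos(lon)], axis=-1)

def reproject(depth, t):
    """Warp panorama pixels to a viewpoint translated by t.

    Back-projects each pixel to a 3D point using its depth, shifts the
    origin to the novel viewpoint, and maps the point back to
    equirectangular pixel coordinates (u, v) plus its new range r.
    """
    h, w = depth.shape
    pts = equirect_to_dirs(h, w) * depth[..., None] - t
    r = np.linalg.norm(pts, axis=-1)
    lon = np.arctan2(pts[..., 0], pts[..., 2])
    lat = np.arcsin(np.clip(pts[..., 1] / r, -1.0, 1.0))
    u = (lon + np.pi) / (2 * np.pi) * w - 0.5
    v = (np.pi / 2 - lat) / np.pi * h - 0.5
    return u, v, r
```

With zero translation the warp is the identity: each pixel maps back to its own coordinates and its range equals its depth. A full renderer would additionally splat or interpolate the warped samples and resolve occlusions by depth, which the sketch omits.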


Results



We compare our method with related methods: Unstructured Lumigraph Rendering (ULR), Inside-Out, and Deep Blending. We demonstrate that our method produces clearer and more complete results on these outdoor-captured 360° panoramic images.



Presentation Video



Citation


@article{chen2022casual,
  title={Casual 6-DoF: free-viewpoint panorama using a handheld 360 camera},
  author={Chen, Rongsen and Zhang, Fang-Lue and Finnie, Simon and Chalmers, Andrew and Rhee, Taehyun},
  journal={IEEE Transactions on Visualization and Computer Graphics},
  year={2022},
  publisher={IEEE}
}


Acknowledgement


This project was supported by the Computational Media Innovation Centre, Victoria University of Wellington, and the Entrepreneurial University Programme by the Tertiary Education Commission in New Zealand.