PanoVerse: automatic generation of stereoscopic environments from single indoor panoramic images for Metaverse applications
Giovanni Pintore, Alberto Jaspe Villanueva, Markus Hadwiger, Enrico Gobbetti, Jens Schneider, and Marco Agus
October 2023
Abstract
We present a novel framework, dubbed PanoVerse, for the automatic creation and presentation of immersive stereoscopic environments from a single indoor panoramic image. Once per 360° shot, a novel data-driven architecture generates a fixed set of panoramic stereo pairs distributed around the current central viewpoint. Once per frame, directly on the HMD, we rapidly fuse the precomputed views to seamlessly cover the exploration workspace. To realize this system, we introduce several novel techniques that combine and extend state-of-the-art data-driven techniques. In particular, we present a gated architecture for panoramic monocular depth estimation and, starting from the re-projection of visible pixels based on predicted depth, we exploit the same gated architecture for inpainting the occluded and disoccluded areas, introducing a mixed GAN with self-supervised loss to evaluate the stereoscopic consistency of the generated images. At interactive rates, we interpolate precomputed panoramas to produce photorealistic stereoscopic views in a lightweight WebXR viewer. The system works on a variety of available VR headsets and can serve as a base component for Metaverse applications. We demonstrate our technology on several indoor scenes from publicly available data.
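The core geometric step described in the abstract — re-projecting the visible pixels of a panorama to a shifted viewpoint using predicted depth, leaving holes at occlusions/disocclusions for a subsequent inpainting network — can be sketched as below. This is a minimal NumPy illustration under simple assumptions (equirectangular layout, metric depth, nearest-pixel forward splatting); it is not the paper's implementation, and all function and parameter names are hypothetical.

```python
import numpy as np

def reproject_equirect(rgb, depth, offset):
    """Forward-project an equirectangular panorama to a shifted viewpoint.

    rgb:    (H, W, 3) panorama image
    depth:  (H, W) per-pixel metric depth (e.g. from a monocular estimator)
    offset: (3,) translation of the new viewpoint
    Returns the re-projected image and a validity mask; the holes the mask
    marks are where an inpainting stage would fill disoccluded regions.
    """
    H, W, _ = rgb.shape
    # Pixel grid -> spherical angles (longitude theta, latitude phi).
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    theta = (u / W) * 2 * np.pi - np.pi        # [-pi, pi)
    phi = (0.5 - v / H) * np.pi                # [+pi/2 .. -pi/2]
    # Spherical coords + predicted depth -> 3D points, shifted to new origin.
    x = depth * np.cos(phi) * np.sin(theta)
    y = depth * np.sin(phi)
    z = depth * np.cos(phi) * np.cos(theta)
    pts = np.stack([x, y, z], axis=-1) - offset
    # 3D points -> spherical coords of the target panorama.
    r = np.linalg.norm(pts, axis=-1)
    theta2 = np.arctan2(pts[..., 0], pts[..., 2])
    phi2 = np.arcsin(np.clip(pts[..., 1] / np.maximum(r, 1e-8), -1.0, 1.0))
    u2 = np.round((theta2 + np.pi) / (2 * np.pi) * W).astype(int) % W
    v2 = np.round((0.5 - phi2 / np.pi) * H).astype(int).clip(0, H - 1)
    # Painter's-order splat: draw far points first so nearer points win.
    out = np.zeros_like(rgb)
    mask = np.zeros((H, W), dtype=bool)
    order = np.argsort(-r, axis=None)
    ui, vi = u2.ravel()[order], v2.ravel()[order]
    out[vi, ui] = rgb.reshape(-1, 3)[order]
    mask[vi, ui] = True
    return out, mask
```

With a zero offset the mapping is (up to pole ambiguity) the identity, which gives a quick sanity check; a non-zero offset produces an unmasked "hole" pattern that a gated inpainting network, as in the paper, would complete.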
Reference and download information
Giovanni Pintore, Alberto Jaspe Villanueva, Markus Hadwiger, Enrico Gobbetti, Jens Schneider, and Marco Agus. PanoVerse: automatic generation of stereoscopic environments from single indoor panoramic images for Metaverse applications. In Proc. Web3D 2023 - 28th International ACM Conference on 3D Web Technology, October 2023. DOI: 10.1145/3611314.3615914. Honorable mention award in the best paper category at Web3D 2023.
Bibtex citation record
@inproceedings{Pintore:2023:PAG,
  author    = {Giovanni Pintore and Alberto {Jaspe Villanueva} and Markus Hadwiger and Enrico Gobbetti and Jens Schneider and Marco Agus},
  title     = {PanoVerse: automatic generation of stereoscopic environments from single indoor panoramic images for Metaverse applications},
  booktitle = {Proc. Web3D 2023 - 28th International ACM Conference on 3D Web Technology},
  month     = {October},
  year      = {2023},
  abstract  = {We present a novel framework, dubbed \textbf{PanoVerse}, for the automatic creation and presentation of immersive stereoscopic environments from a single indoor panoramic image. Once per 360$^\circ$ shot, a novel data-driven architecture generates a fixed set of panoramic stereo pairs distributed around the current central viewpoint. Once per frame, directly on the HMD, we rapidly fuse the precomputed views to seamlessly cover the exploration workspace. To realize this system, we introduce several novel techniques that combine and extend state-of-the-art data-driven techniques. In particular, we present a gated architecture for panoramic monocular depth estimation and, starting from the re-projection of visible pixels based on predicted depth, we exploit the same gated architecture for inpainting the occluded and disoccluded areas, introducing a mixed GAN with self-supervised loss to evaluate the stereoscopic consistency of the generated images. At interactive rates, we interpolate precomputed panoramas to produce photorealistic stereoscopic views in a lightweight WebXR viewer. The system works on a variety of available VR headsets and can serve as a base component for Metaverse applications. We demonstrate our technology on several indoor scenes from publicly available data.},
  doi       = {10.1145/3611314.3615914},
  note      = {Honorable mention award in the best paper category at Web3D 2023},
  url       = {http://vic.crs4.it/vic/cgi-bin/bib-page.cgi?id='Pintore:2023:PAG'},
}
The publications listed here are included as a means to ensure timely
dissemination of scholarly and technical work on a non-commercial basis.
Copyright and all rights therein are maintained by the authors or by
other copyright holders, notwithstanding that they have offered their works
here electronically. It is understood that all persons copying this
information will adhere to the terms and constraints invoked by each
author's copyright. These works may not be reposted without the
explicit permission of the copyright holder.
Please contact the authors if you are willing to republish this work in
a book, journal, on the Web or elsewhere. Thank you in advance.