PanoStyleVR: style-based similarity metrics for Web-based immersive panoramic style transfer
Muhammad Tukur, Sara Jashari, Fabio Bettio, Giovanni Pintore, Enrico Gobbetti, Jens Schneider, and Marco Agus
September 2025
Abstract
We introduce PanoStyleVR, an immersive web-based framework for analyzing, ranking, and interactively applying style similarities within panoramic indoor scenes, enabling stereoscopic virtual exploration and photorealistic style adaptation. A key innovation of our system is a fully immersive WebXR interface, allowing users wearing head-mounted displays to navigate indoor environments in stereo and apply new styles in real time. Style suggestions are visualized through floating thumbnails rendered in the VR space; selecting a style triggers photorealistic transfer on the current room view and updates the immersive stereo representation. This interactive pipeline is powered by two integrated neural components: (1) a geometry-aware and shading-independent GAN-based framework for semantic style transfer on albedo-reflectance representations; and (2) a gated architecture that synthesizes omnidirectional stereoscopic views from a single 360° panorama for realistic depth-aware exploration. Our system enables cosine-similarity-based style ranking, t-SNE-driven dimensionality reduction, and GMM-based clustering over large-scale panoramic datasets. These components support an immersive recommendation mechanism that connects stylistic analysis with interactive editing. Experimental evaluations on the Structured3D dataset demonstrate strong alignment between perceptual similarity and our proposed metric, and effective grouping of panoramas based on latent style representations.
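To give a rough sense of the style-analysis components named in the abstract (cosine-similarity ranking, t-SNE projection, and GMM clustering of latent style embeddings), the following Python sketch uses NumPy and scikit-learn. All names here (style_embeddings, cosine_rank, project_and_cluster) are our own illustrative assumptions rather than the paper's actual code, and whether the authors cluster the raw latent vectors or a reduced projection is an assumption of this sketch.

import numpy as np
from sklearn.manifold import TSNE
from sklearn.mixture import GaussianMixture

def cosine_rank(query, style_embeddings, top_k=5):
    # Normalize, then rank all panoramas by cosine similarity to the query style.
    q = query / np.linalg.norm(query)
    e = style_embeddings / np.linalg.norm(style_embeddings, axis=1, keepdims=True)
    sims = e @ q
    order = np.argsort(-sims)
    return order[:top_k], sims[order[:top_k]]

def project_and_cluster(style_embeddings, n_clusters=8, seed=0):
    # 2D t-SNE projection for visualization; diagonal-covariance GMM clustering
    # on the latent style vectors (assumed setup, not the authors' exact choice).
    coords = TSNE(n_components=2, random_state=seed).fit_transform(style_embeddings)
    labels = GaussianMixture(n_components=n_clusters, covariance_type="diag",
                             random_state=seed).fit_predict(style_embeddings)
    return coords, labels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    embeddings = rng.normal(size=(200, 512)).astype(np.float32)  # stand-in style vectors
    top_idx, top_sims = cosine_rank(embeddings[0], embeddings)
    coords, labels = project_and_cluster(embeddings)
    print(top_idx, top_sims, labels[:10])

In such a setup, the ranked indices returned by cosine_rank would be the natural source for the floating style-thumbnail suggestions shown in the WebXR view, with the cluster labels supporting the grouping reported on Structured3D.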
Reference and download information
Muhammad Tukur, Sara Jashari, Fabio Bettio, Giovanni Pintore, Enrico Gobbetti, Jens Schneider, and Marco Agus. PanoStyleVR: style-based similarity metrics for Web-based immersive panoramic style transfer. In Proc. ACM Web3D, September 2025. To appear.
Bibtex citation record
@inproceedings{Tukur:2025:PSS,
  author    = {Muhammad Tukur and Sara Jashari and Fabio Bettio and Giovanni Pintore and Enrico Gobbetti and Jens Schneider and Marco Agus},
  title     = {{PanoStyleVR}: style-based similarity metrics for Web-based immersive panoramic style transfer},
  booktitle = {Proc. ACM Web3D},
  month     = {September},
  year      = {2025},
  abstract  = {We introduce PanoStyleVR, an immersive web-based framework for analyzing, ranking, and interactively applying style similarities within panoramic indoor scenes, enabling stereoscopic virtual exploration and photorealistic style adaptation. A key innovation of our system is a fully immersive \emph{WebXR} interface, allowing users wearing head-mounted displays to navigate indoor environments in stereo and apply new styles in real time. Style suggestions are visualized through floating thumbnails rendered in the VR space; selecting a style triggers photorealistic transfer on the current room view and updates the immersive stereo representation. This interactive pipeline is powered by two integrated neural components: (1) a geometry-aware and shading-independent GAN-based framework for semantic style transfer on albedo-reflectance representations; and (2) a gated architecture that synthesizes omnidirectional stereoscopic views from a single 360$^\circ$ panorama for realistic depth-aware exploration. Our system enables cosine-similarity-based style ranking, t-SNE-driven dimensionality reduction, and GMM-based clustering over large-scale panoramic datasets. These components support an immersive recommendation mechanism that connects stylistic analysis with interactive editing. Experimental evaluations on the Structured3D dataset demonstrate strong alignment between perceptual similarity and our proposed metric, and effective grouping of panoramas based on latent style representations.},
  note      = {To appear},
  url       = {http://vic.crs4.it/vic/cgi-bin/bib-page.cgi?id='Tukur:2025:PSS'},
}
The publications listed here are included as a means to ensure timely
dissemination of scholarly and technical work on a non-commercial basis.
Copyright and all rights therein are maintained by the authors or by
other copyright holders, notwithstanding that they have offered their works
here electronically. It is understood that all persons copying this
information will adhere to the terms and constraints invoked by each
author's copyright. These works may not be reposted without the
explicit permission of the copyright holder.
Please contact the authors if you are willing to republish this work in
a book, journal, on the Web or elsewhere. Thank you in advance.