Effective interactive visualization of neural relightable images in a web-based multi-layered framework
Leonardo Righetto, Fabio Bettio, Federico Ponchio, Andrea Giachetti, and Enrico Gobbetti
2023
Abstract
Relightable images created from Multi-Light Image Collections (MLICs) are among the most commonly employed models for interactive object exploration in cultural heritage. In recent years, neural representations have been shown to produce higher-quality images, at similar storage costs, than classic analytical models such as Polynomial Texture Maps (PTM) or Hemispherical Harmonics (HSH). However, their integration in practical interactive tools has so far been limited by the higher evaluation cost, which makes interactive inspection of large images difficult, and by the integration effort required to incorporate deep-learning libraries into relightable renderers. In this paper, we illustrate how a state-of-the-art neural reflectance model can be directly evaluated, using common WebGL shader features, inside a multi-platform renderer. We then show how this solution can be embedded in a scalable framework capable of handling multi-layered relightable models in web settings. We finally demonstrate the performance and capabilities of the method on cultural heritage objects.
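To make the shader-based evaluation concrete, the sketch below shows how a small neural decoder of this kind could be run per pixel in a WebGL2 fragment shader. It is a minimal illustration under assumed parameters, not the implementation described in the paper: the network layout (a 2D light direction plus 8 per-pixel latent features sampled from two RGBA textures, one hidden layer of 16 ELU units, a linear RGB output) and all names (uFeat0, uW1, ...) are hypothetical.

// Minimal sketch (TypeScript + WebGL2 GLSL) of per-pixel neural decoding
// in a fragment shader. Hypothetical layout: the decoder takes a 2D light
// direction plus 8 latent features (two RGBA textures) and has one hidden
// layer of 16 ELU units; the actual network in the paper may differ.
export const neuralRelightFrag = /* glsl */ `#version 300 es
precision highp float;

in vec2 vUv;                // texture coordinate from the vertex stage
out vec4 fragColor;

uniform sampler2D uFeat0;   // per-pixel latent features, channels 0-3
uniform sampler2D uFeat1;   // per-pixel latent features, channels 4-7
uniform vec2 uLightDir;     // x,y components of the unit light direction

uniform float uW1[160];     // hidden-layer weights (16 x 10, row major)
uniform float uB1[16];      // hidden-layer biases
uniform float uW2[48];      // output weights (3 x 16, row major)
uniform float uB2[3];       // output biases

float elu(float x) { return x > 0.0 ? x : exp(x) - 1.0; }

void main() {
  // Assemble the decoder input: light direction + per-pixel latent code.
  vec4 f0 = texture(uFeat0, vUv);
  vec4 f1 = texture(uFeat1, vUv);
  float h0[10];
  h0[0] = uLightDir.x; h0[1] = uLightDir.y;
  h0[2] = f0.x; h0[3] = f0.y; h0[4] = f0.z; h0[5] = f0.w;
  h0[6] = f1.x; h0[7] = f1.y; h0[8] = f1.z; h0[9] = f1.w;

  // Hidden layer: plain loops suffice for a network this small;
  // larger decoders would pack weights into textures and use texelFetch.
  float h1[16];
  for (int j = 0; j < 16; ++j) {
    float acc = uB1[j];
    for (int i = 0; i < 10; ++i) acc += uW1[j * 10 + i] * h0[i];
    h1[j] = elu(acc);
  }

  // Linear RGB output layer.
  vec3 rgb = vec3(uB2[0], uB2[1], uB2[2]);
  for (int i = 0; i < 16; ++i) {
    rgb += vec3(uW2[i], uW2[16 + i], uW2[32 + i]) * h1[i];
  }
  fragColor = vec4(clamp(rgb, 0.0, 1.0), 1.0);
}`;

Because the decoder's inputs are ordinary textures and its weights are ordinary uniforms, a shader of this shape can replace a PTM/HSH evaluation shader in an existing tiled, multi-layered web viewer without touching the rest of the rendering pipeline, which is the property the paper exploits.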
Reference and download information
Leonardo Righetto, Fabio Bettio, Federico Ponchio, Andrea Giachetti, and Enrico Gobbetti. Effective interactive visualization of neural relightable images in a web-based multi-layered framework. In The 21st Eurographics Workshop on Graphics and Cultural Heritage. Pages 57-66, 2023. DOI: 10.2312/gch.20231158.
Bibtex citation record
@inproceedings{Righetto:2023:EIV,
  author = {Leonardo Righetto and Fabio Bettio and Federico Ponchio and Andrea Giachetti and Enrico Gobbetti},
  title = {Effective interactive visualization of neural relightable images in a web-based multi-layered framework},
  booktitle = {The 21st Eurographics Workshop on Graphics and Cultural Heritage},
  pages = {57--66},
  year = {2023},
  abstract = {Relightable images created from Multi-Light Image Collections (MLICs) are among the most commonly employed models for interactive object exploration in cultural heritage. In recent years, neural representations have been shown to produce higher-quality images, at similar storage costs, than classic analytical models such as Polynomial Texture Maps (PTM) or Hemispherical Harmonics (HSH). However, their integration in practical interactive tools has so far been limited by the higher evaluation cost, which makes interactive inspection of large images difficult, and by the integration effort required to incorporate deep-learning libraries into relightable renderers. In this paper, we illustrate how a state-of-the-art neural reflectance model can be directly evaluated, using common WebGL shader features, inside a multi-platform renderer. We then show how this solution can be embedded in a scalable framework capable of handling multi-layered relightable models in web settings. We finally demonstrate the performance and capabilities of the method on cultural heritage objects.},
  doi = {10.2312/gch.20231158},
  url = {http://vic.crs4.it/vic/cgi-bin/bib-page.cgi?id='Righetto:2023:EIV'},
}
The publications listed here are included as a means to ensure timely
dissemination of scholarly and technical work on a non-commercial basis.
Copyright and all rights therein are maintained by the authors or by
other copyright holders, notwithstanding that they have offered their works
here electronically. It is understood that all persons copying this
information will adhere to the terms and constraints invoked by each
author's copyright. These works may not be reposted without the
explicit permission of the copyright holder.
Please contact the authors if you wish to republish this work in
a book, journal, on the Web or elsewhere. Thank you in advance.
All references on the main publication page are linked to a descriptive page
providing the relevant bibliographic data and, where available, a link to
the related document. Please refer to our main publication repository page
for direct links to the documents.