Efficient and user-friendly visualization of neural relightable images for cultural heritage applications
Leonardo Righetto, Mohammad Khademizadeh, Andrea Giachetti, Federico Ponchio, Davit Gigilashvili, Fabio Bettio, and Enrico Gobbetti
2024
Abstract
We introduce an innovative multiresolution framework for encoding and interactively visualizing large relightable images using a neural reflectance model derived from a state-of-the-art technique. The framework is seamlessly integrated into a scalable multi-platform system that supports adaptive streaming and exploration of multi-layered relightable models in web settings. To enhance efficiency, we optimized the neural model, simplified decoding, and implemented a custom WebGL shader specific to the task, eliminating the need to integrate a deep-learning library in the viewer code. Additionally, we introduce an efficient level-of-detail management system supporting fine-grained adaptive rendering through on-the-fly resampling in latent feature space. The resulting viewer facilitates interactive neural relighting of large images. Its modular design allows the incorporation of functionalities for Cultural Heritage analysis, such as loading and simultaneous visualization of multiple relightable layers with arbitrary rotations.
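The shader-based decoding mentioned in the abstract is detailed in the article itself; as a rough, hypothetical sketch of the general idea (not the authors' actual implementation), the TypeScript module below shows how a tiny per-pixel MLP decoder could be evaluated directly in a WebGL fragment shader, with the latent features stored in ordinary textures and the trained weights uploaded as uniforms. All identifiers, layer sizes, and the texture layout are illustrative assumptions.

// Hypothetical sketch: evaluating a small neural reflectance decoder per pixel
// in a WebGL fragment shader, so no deep-learning library is needed at view
// time. Layer sizes, texture layout, and all names are assumptions, not the
// authors' published implementation.
export const relightFragmentShader = `
precision highp float;

varying vec2 vUv;               // pixel coordinate in the relightable image
uniform sampler2D uFeatures0;   // latent channels 0-3 (RGBA texture)
uniform sampler2D uFeatures1;   // latent channels 4-7 (RGBA texture)
uniform vec3 uLightDir;         // interactively chosen light direction
uniform mat3 uLayerRotation;    // per-layer rotation for multi-layer viewing

// Trained decoder weights, uploaded once as uniforms.
uniform mat4 uW0a, uW0b, uW0c;  // input layer: 8 latent channels + light
uniform vec4 uB0;               // input-layer bias
uniform mat4 uW1;               // hidden layer -> RGB (fourth channel unused)
uniform vec4 uB1;               // output bias

void main() {
  // Fetch the 8-channel latent code stored for this pixel.
  vec4 f0 = texture2D(uFeatures0, vUv);
  vec4 f1 = texture2D(uFeatures1, vUv);

  // Rotate the light into the layer's own frame before decoding.
  vec3 l = normalize(uLayerRotation * uLightDir);

  // Tiny MLP: one hidden layer with ReLU, then a linear RGB head.
  vec4 h = uW0a * f0 + uW0b * f1 + uW0c * vec4(l, 0.0) + uB0;
  h = max(h, 0.0);
  vec4 rgb = uW1 * h + uB1;

  gl_FragColor = vec4(clamp(rgb.rgb, 0.0, 1.0), 1.0);
}`;

Because the latent features live in plain textures, the on-the-fly level-of-detail resampling in latent feature space described in the abstract can, at least conceptually, reuse standard texture interpolation between resolution levels; this reading is likewise only an assumption, not a description of the published system.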
Reference and download information
Leonardo Righetto, Mohammad Khademizadeh, Andrea Giachetti, Federico Ponchio, Davit Gigilashvili, Fabio Bettio, and Enrico Gobbetti. Efficient and user-friendly visualization of neural relightable images for cultural heritage applications. ACM Journal on Computing and Cultural Heritage (JOCCH), 17, 2024. DOI: 10.1145/3690390. To appear.
Bibtex citation record
@article{Righetto:2024:EUV,
  author   = {Leonardo Righetto and Mohammad Khademizadeh and Andrea Giachetti and Federico Ponchio and Davit Gigilashvili and Fabio Bettio and Enrico Gobbetti},
  title    = {Efficient and user-friendly visualization of neural relightable images for cultural heritage applications},
  journal  = {ACM Journal on Computing and Cultural Heritage (JOCCH)},
  volume   = {17},
  year     = {2024},
  abstract = {We introduce an innovative multiresolution framework for encoding and interactively visualizing large relightable images using a neural reflectance model derived from a state-of-the-art technique. The framework is seamlessly integrated into a scalable multi-platform system that supports adaptive streaming and exploration of multi-layered relightable models in web settings. To enhance efficiency, we optimized the neural model, simplified decoding, and implemented a custom WebGL shader specific to the task, eliminating the need to integrate a deep-learning library in the viewer code. Additionally, we introduce an efficient level-of-detail management system supporting fine-grained adaptive rendering through on-the-fly resampling in latent feature space. The resulting viewer facilitates interactive neural relighting of large images. Its modular design allows the incorporation of functionalities for Cultural Heritage analysis, such as loading and simultaneous visualization of multiple relightable layers with arbitrary rotations.},
  doi      = {10.1145/3690390},
  note     = {To appear},
  url      = {http://vic.crs4.it/vic/cgi-bin/bib-page.cgi?id='Righetto:2024:EUV'},
}
The publications listed here are included as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.

Please contact the authors if you wish to republish this work in a book, journal, on the Web, or elsewhere. Thank you in advance.