Guiding Lens-based Exploration using Annotation Graphs
Moonisa Ahsan, Fabio Marton, Ruggero Pintus, and Enrico Gobbetti
October 2021
Abstract
We introduce a novel approach for guiding users in the exploration of annotated 2D models using interactive visualization lenses. Information on the interesting areas of the model is encoded in an annotation graph generated at authoring time. Each graph node contains an annotation, in the form of a visual markup of the area of interest, together with the optimal lens parameters that should be used to explore the annotated area and a scalar representing the annotation's importance. Graph edges, instead, represent preferred ordering relations in the presentation of annotations; a scalar associated with each edge determines the strength of this prescription. At run-time, the graph is exploited to assist users in their navigation by determining the next best annotation in the database and moving the lens towards it when the user releases interactive control. The selection is based on the current view and lens parameters, the graph content and structure, and the navigation history. This approach supports the seamless blending of an automatic tour of the data with interactive lens-based exploration. The approach is tested and discussed in the context of the exploration of multi-layer relightable models.
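To make the structure concrete, the following is a minimal sketch of an annotation graph and a next-best-annotation selection step as described in the abstract. All names, the lens parameterization, and the scoring formula (importance, penalized by lens distance, boosted by the strength of an edge from the last visited node) are illustrative assumptions, not the paper's exact formulation.

```python
from dataclasses import dataclass, field
from math import dist

@dataclass
class Annotation:
    center: tuple          # (x, y) position of the area of interest
    radius: float          # optimal lens radius for exploring it
    importance: float      # scalar annotation importance

@dataclass
class AnnotationGraph:
    nodes: dict = field(default_factory=dict)   # node id -> Annotation
    edges: dict = field(default_factory=dict)   # (src, dst) -> ordering strength

    def next_best(self, lens_center, visited):
        """Pick the next annotation when the user releases control:
        favor important, nearby nodes, boosted by the strength of an
        edge from the last visited node; skip nodes already seen."""
        last = visited[-1] if visited else None
        best, best_score = None, float("-inf")
        for nid, ann in self.nodes.items():
            if nid in visited:
                continue
            score = ann.importance - 0.1 * dist(lens_center, ann.center)
            if last is not None:
                score += self.edges.get((last, nid), 0.0)
            if score > best_score:
                best, best_score = nid, score
        return best

# Hypothetical usage: two annotated areas with a preferred a -> b ordering.
g = AnnotationGraph()
g.nodes["a"] = Annotation(center=(0, 0), radius=1.0, importance=0.5)
g.nodes["b"] = Annotation(center=(10, 0), radius=2.0, importance=0.9)
g.edges[("a", "b")] = 1.0
```

With the lens at the origin and node "a" already visited, `g.next_best((0, 0), ["a"])` selects "b", combining its importance, the distance penalty, and the a-to-b edge strength.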
Reference and download information
Moonisa Ahsan, Fabio Marton, Ruggero Pintus, and Enrico Gobbetti. Guiding Lens-based Exploration using Annotation Graphs. In Proc. Smart Tools and Applications in Graphics (STAG). Pages 85-90, October 2021. DOI: 10.2312/stag.20211477. Honorable mention in best paper award category at STAG 2021.
Bibtex citation record
@inproceedings{Ahsan:2021:GLE, author = {Moonisa Ahsan and Fabio Marton and Ruggero Pintus and Enrico Gobbetti}, title = {Guiding Lens-based Exploration using Annotation Graphs}, booktitle = {Proc. Smart Tools and Applications in Graphics (STAG)}, pages = {85--90}, month = {October}, year = {2021}, abstract = { We introduce a novel approach for guiding users in the exploration of annotated 2D models using interactive visualization lenses. Information on the interesting areas of the model is encoded in an annotation graph generated at authoring time. Each graph node contains an annotation, in the form of a visual markup of the area of interest, as well as the optimal lens parameters that should be used to explore the annotated area and a scalar representing the annotation importance. Graph edges are used, instead, to represent preferred ordering relations in the presentation of annotations. A scalar associated to each edge determines the strength of this prescription. At run-time, the graph is exploited to assist users in their navigation by determining the next best annotation in the database and moving the lens towards it when the user releases interactive control. The selection is based on the current view and lens parameters, the graph content and structure, and the navigation history. This approach supports the seamless blending of an automatic tour of the data with interactive lens-based exploration. The approach is tested and discussed in the context of the exploration of multi-layer relightable models. }, doi = {10.2312/stag.20211477}, note = {Honorable mention in best paper award category at STAG 2021}, url = {http://vic.crs4.it/vic/cgi-bin/bib-page.cgi?id='Ahsan:2021:GLE'}, }
The publications listed here are included as a means to ensure timely
dissemination of scholarly and technical work on a non-commercial basis.
Copyright and all rights therein are maintained by the authors or by
other copyright holders, notwithstanding that they have offered their works
here electronically. It is understood that all persons copying this
information will adhere to the terms and constraints invoked by each
author's copyright. These works may not be reposted without the
explicit permission of the copyright holder.
Please contact the authors if you wish to republish this work in
a book, journal, on the Web, or elsewhere. Thank you in advance.
All references on the main publication page are linked to a descriptive page
providing relevant bibliographic data and, possibly, a link to
the related document. Please refer to our main
publication repository page for direct links to documents.