B01 | Adaptive Self-Consistent Visualization

Prof. Daniel Weiskopf, Universität Stuttgart

Prof. Daniel A. Keim, Universität Konstanz

Rudolf Netzel, Universität Stuttgart

Tanja Munz, Universität Stuttgart

Nils Rodrigues, Universität Stuttgart

Our long-term goal is to adapt the computer-based visualization process so that the visualization outcome is consistent with the input data and the objectives of the visualization itself. This project thus contributes to the key challenge of obtaining reliable visualizations. Our model of self-consistency is formulated quantitatively: how much does the visualization result deviate from the intended result? The adaptation optimizes parameters of the visualization pipeline to minimize this deviation; it builds on perceptual models, techniques from computer vision, efficient optimization methods, and support for multi-objective optimization.
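As a minimal sketch of this idea, the toy example below optimizes a single pipeline parameter so that the output, after passing through a stand-in perceptual model, matches the structure of the input data. The pipeline (a gamma mapping), the power-law "perceptual model", and the deviation measure are all illustrative assumptions, not the project's actual models.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy input data: scalar values to be encoded as grayscale intensities.
data = np.array([0.1, 0.3, 0.5, 0.7, 0.9])

def render(gamma):
    """Hypothetical one-parameter pipeline: gamma-map data to intensities."""
    return data ** gamma

def perceived(intensity):
    """Crude stand-in for a perceptual model (Stevens-like power law)."""
    return intensity ** 0.5

def deviation(gamma):
    """Self-consistency objective: differences between perceived
    intensities should match the differences in the input data."""
    p = perceived(render(gamma))
    return np.sum((np.diff(p) - np.diff(data)) ** 2)

# Adapt the pipeline parameter to minimize the deviation.
result = minimize_scalar(deviation, bounds=(0.1, 10.0), method="bounded")
```

With these assumed models the perceived intensity is `data ** (gamma / 2)`, so the deviation vanishes at `gamma = 2`; a real instance of the framework would optimize many parameters against empirically validated perceptual models, possibly with multiple competing objectives.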

Research Questions

How can we build an efficient optimization framework for the adaptation of self-consistent visualization techniques?

What kinds of quantitative models from the perceptual psychology and computer vision literature can be used to optimize the encoding of direction information in 2D visualizations?

Which quantitative models are useful for optimizing animated visualizations with moving patterns?

How can we adapt a visualization so that the contrast of the visualization image preserves the structure of relative differences or derivatives in the input data?

How can multi-objective optimization be used for overlaid visualization of multivariate data?

Figure: Readability evaluation of node-link diagrams of trajectories using different link types that encode direction. Here, a repeating comet texture indicates the direction; the texture is adapted to the human perception of direction. In addition, node splatting indicates node locations and spatial node density.


Publications

  1. Rodrigues, Nils; Weiskopf, Daniel (2017): "Nonlinear Dot Plots". In: IEEE VIS 2017.
  2. Rodrigues, Nils; Netzel, Rudolf; Ullah, Kazi R.; et al. (2017b): "Visualization of Time Series Data with Spatial Context: Communicating the Energy Production of Power Plants". In: VINCI 2017.
  3. Kurzhals, Kuno; Hlawatsch, Marcel; Seeger, Christof; et al. (2017): "Visual Analytics for Mobile Eye Tracking". In: IEEE Transactions on Visualization and Computer Graphics, pp. 301–310, DOI: 10.1109/TVCG.2016.2598695.
  4. Burch, Michael; Hlawatsch, Marcel; Weiskopf, Daniel (2017): "Visualizing a Sequence of a Thousand Graphs (or Even More)". In: Computer Graphics Forum, pp. 261–271, DOI: 10.1111/cgf.13185.
  5. Rodrigues, Nils; Burch, Michael; Di Silvestro, Lorenzo; et al. (2017a): "A Visual Analytics Approach for Word Relevances in Multiple Texts". In: International Conference Information Visualisation (IV), 2017.
  6. Weiskopf, D.; Burch, M.; Chuang, L. L.; et al. (2016): Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer.
  7. Netzel, Rudolf; Burch, Michael; Weiskopf, Daniel (2016): "Interactive Scanpath-Oriented Annotation of Fixations". In: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, pp. 183–187.
  8. Burch, Michael; Woods, Robin; Netzel, Rudolf; et al. (2016): "The Challenges of Designing Metro Maps". In: Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, pp. 195–202.
  9. Kurzhals, Kuno; Burch, Michael; Pfeiffer, Thies; et al. (2015a): "Eye Tracking in Computer-Based Visualization". In: Computing in Science and Engineering 17 (5), pp. 64–71.
  10. Kurzhals, Kuno; Hlawatsch, Marcel; Heimerl, Florian; et al. (2015b): "Gaze Stripes: Image-Based Visualization of Eye Tracking Data". In: IEEE Xplore Digital Library.