B01 | Adaptive Self-Consistent Visualization

Prof. Daniel Weiskopf, Universität Stuttgart

Prof. Daniel A. Keim, Universität Konstanz

Rudolf Netzel, Universität Stuttgart

Tanja Munz, Universität Stuttgart

Nils Rodrigues, Universität Stuttgart

Our long-term goal is to adapt the computer-based visualization process so that the visualization outcome is consistent with the input data and with the objectives of the visualization itself. The project thus contributes to the key challenge of obtaining reliable visualizations. Our model of self-consistency is formulated quantitatively: how much does the visualization result deviate from the intended result? The adaptation optimizes parameters of the visualization pipeline to minimize this deviation, drawing on perceptual models, techniques from computer vision, efficient optimization methods, and support for multi-objective optimization.
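To make the model concrete, the following minimal Python sketch shows the kind of adaptation loop this describes, under strong simplifying assumptions: the render function, its two parameters, and the plain L2 deviation are hypothetical stand-ins for the project's actual pipeline, perceptual models, and computer-vision techniques.

```python
# Minimal sketch of the adaptation loop described above; all function and
# parameter names are hypothetical placeholders, not the project's code.
import numpy as np
from scipy.optimize import minimize

def render(data, theta):
    """Hypothetical visualization pipeline: maps input data and pipeline
    parameters theta (here just a scale and a gamma) to an image."""
    scale, gamma = theta
    return np.clip(scale * data, 1e-6, 1.0) ** gamma

def deviation(image, intended):
    """Quantitative self-consistency measure: how far the rendered image is
    from the intended result. A plain L2 difference stands in for the
    perceptual and computer-vision models mentioned in the text."""
    return float(np.mean((image - intended) ** 2))

def adapt(data, intended, theta0):
    """Optimize the pipeline parameters to minimize the deviation."""
    objective = lambda theta: deviation(render(data, theta), intended)
    return minimize(objective, theta0, method="Nelder-Mead").x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.random((64, 64))
    intended = data  # ideal outcome: the image reproduces the data structure
    print("optimized parameters:", adapt(data, intended, np.array([0.5, 2.0])))
```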

Research Questions

How can we build an efficient optimization framework for the adaptation of self-consistent visualization techniques?

What kinds of quantitative models from the perceptual psychology and computer vision literature can be used to optimize the encoding of direction information in 2D visualizations?

Which quantitative models are useful for optimizing animated visualizations with moving patterns?

How can we adapt the visualization so that the structure of relative differences or derivatives of the input data is preserved as contrast in the visualization images?

How can multi-objective optimization be used for overlaid visualization of multivariate data?

Figure: Readability evaluation of node-link diagrams of trajectories with different link types that encode direction. Here, a repeating comet texture indicates the direction; the texture is adapted to human perception of direction. In addition, node splatting indicates node locations and spatial node density.
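As an illustration of this kind of direction encoding (not the project's implementation), the following sketch draws a link with a repeating comet-like intensity ramp, assuming the comet is simply a texture whose brightness gradient points along the link.

```python
# Toy sketch of a repeating "comet" texture that encodes link direction;
# the specific ramp and parameters are illustrative assumptions only.
import numpy as np
import matplotlib.pyplot as plt

def comet_profile(length_px, period_px):
    """1D intensity profile: within each period the intensity ramps from a
    dim tail to a bright head, so the gradient points along the link."""
    return (np.arange(length_px) % period_px) / (period_px - 1)

def draw_comet_link(ax, p0, p1, period_px=20, width=4):
    """Draw a straight link from p0 to p1 as short grayscale segments whose
    brightness follows the comet profile (segments darken toward each head)."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    n = int(np.hypot(*(p1 - p0)))
    intensity = comet_profile(n, period_px)
    pts = p0 + np.outer(np.linspace(0.0, 1.0, n), p1 - p0)
    for i in range(n - 1):
        gray = 1.0 - 0.8 * intensity[i]
        ax.plot(pts[i:i + 2, 0], pts[i:i + 2, 1],
                color=(gray, gray, gray), linewidth=width)

fig, ax = plt.subplots()
draw_comet_link(ax, (0, 0), (200, 100))
ax.set_aspect("equal")
plt.show()
```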

Publications

  1. N. Rodrigues and D. Weiskopf, “Nonlinear Dot Plots,” in IEEE VIS 2017, 2017.
  2. N. Rodrigues, R. Netzel, K. R. Ullah, M. Burch, A. Schultz, B. Burger, and D. Weiskopf, “Visualization of Time Series Data with Spatial Context: Communicating the Energy Production of Power Plants,” in Proceedings of the 10th International Symposium on Visual Information Communication and Interaction, New York, NY, USA, 2017, vol. VINCI '17, no. 8, pp. 37–44.
  3. K. Kurzhals, M. Hlawatsch, C. Seeger, and D. Weiskopf, “Visual Analytics for Mobile Eye Tracking,” IEEE Transactions on Visualization and Computer Graphics, vol. 23, no. 1, pp. 301–310, 2017.
  4. M. Burch, M. Hlawatsch, and D. Weiskopf, “Visualizing a Sequence of a Thousand Graphs (or Even More),” Computer Graphics Forum, vol. 36, no. 3, pp. 261–271, 2017.
  5. N. Rodrigues, M. Burch, L. Di Silvestro, and D. Weiskopf, “A Visual Analytics Approach for Word Relevances in Multiple Texts,” 2017.
  6. D. Weiskopf, M. Burch, L. L. Chuang, B. Fischer, and A. Schmidt, Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer, 2016.
  7. R. Netzel, M. Burch, and D. Weiskopf, “Interactive Scanpath-Oriented Annotation of Fixations,” 2016, pp. 183–187.
  8. M. Burch, R. Woods, R. Netzel, and D. Weiskopf, “The Challenges of Designing Metro Maps,” 2016, pp. 195–202.
  9. K. Kurzhals, M. Burch, T. Pfeiffer, and D. Weiskopf, “Eye Tracking in Computer-Based Visualization,” Computing in Science and Engineering, vol. 17, no. 5, pp. 64–71, 2015.
  10. K. Kurzhals, M. Hlawatsch, F. Heimerl, M. Burch, T. Ertl, and D. Weiskopf, “Gaze Stripes: Image-Based Visualization of Eye Tracking Data,” IEEE Xplore Digital Library, 2015.

...