B01 | Adaptive Self-Consistent Visualization

Prof. Daniel Weiskopf, Universität Stuttgart

Prof. Daniel A. Keim, Universität Konstanz

Rudolf Netzel, Universität Stuttgart

Tanja Munz, Universität Stuttgart

Nils Rodrigues, Universität Stuttgart

Our long-term goal is to adapt the computer-based visualization process so that the visualization outcome is consistent with the input data and with the objectives of the visualization itself. The project thus contributes to the key challenge of obtaining reliable visualizations. Our model of self-consistency is formulated quantitatively: how much does the visualization result deviate from the intended result? The adaptation optimizes parameters of the visualization pipeline to minimize this deviation, building on perceptual models, techniques from computer vision, efficient optimization methods, and support for multi-objective optimization.
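To make this quantitative formulation concrete, the following sketch treats the visualization as a parameterized mapping and minimizes a deviation metric over its parameters. It is a hypothetical toy example, not the project's implementation: the `render` function, the gain/gamma parameters, and the use of mean squared error stand in for a real visualization pipeline and a perceptual deviation model.

```python
# Minimal sketch: adapt visualization parameters so that the rendered
# image deviates as little as possible from the intended result.
# All functions below are illustrative placeholders.
import numpy as np
from scipy.optimize import minimize

def render(params, data):
    """Toy 'visualization': map data to pixel intensities via gain and gamma."""
    gain, gamma = params
    return np.clip(gain * data, 0.0, 1.0) ** max(gamma, 1e-6)

def deviation(params, data, intended):
    """Quantitative self-consistency measure: distance between the rendered
    image and the intended result. Mean squared error is used here only for
    simplicity; a perceptual model would replace it in practice."""
    return np.mean((render(params, data) - intended) ** 2)

data = np.random.default_rng(0).random((64, 64))
intended = data  # intended result: a faithful reproduction of the input

result = minimize(deviation, x0=[0.5, 2.0], args=(data, intended),
                  method="Nelder-Mead")
print("optimized parameters (gain, gamma):", result.x)
```

In this toy setting the optimizer recovers parameters close to an identity mapping; the same structure carries over when the deviation term is replaced by perceptual or computer-vision-based metrics.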

Research Questions

How can we build an efficient optimization framework for the adaptation of self-consistent visualization techniques?

Which quantitative models from the perceptual psychology and computer vision literature can be used to optimize the encoding of direction information in 2D visualizations?

Which quantitative models are useful for optimizing animated visualizations with moving patterns?

How can we adapt the visualization so that the structure of relative differences or derivatives in the input data is preserved as contrast in the visualization images?

How can multi-objective optimization be used for overlaid visualization of multivariate data?
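One common way to approach the last question is to scalarize competing criteria and solve a family of single-objective problems. The sketch below is purely illustrative: the clutter and information-loss objectives and the single overlay-opacity parameter are hypothetical stand-ins, not the project's actual criteria, but they show how weighted-sum scalarization traces different trade-offs.

```python
# Minimal sketch of multi-objective optimization via weighted-sum
# scalarization for an overlaid visualization with one free parameter
# (the opacity of the overlay layer). Objectives are hypothetical.
from scipy.optimize import minimize_scalar

def clutter(alpha):
    """Hypothetical: clutter grows with the opacity of the overlay layer."""
    return alpha ** 2

def information_loss(alpha):
    """Hypothetical: information about the overlaid variable is lost
    when its opacity is low."""
    return (1.0 - alpha) ** 2

def scalarized(alpha, w):
    """Weighted sum of both objectives; varying w yields different
    Pareto-optimal trade-offs for this convex toy problem."""
    return w * clutter(alpha) + (1.0 - w) * information_loss(alpha)

for w in (0.25, 0.5, 0.75):
    res = minimize_scalar(lambda a: scalarized(a, w),
                          bounds=(0.0, 1.0), method="bounded")
    print(f"w={w:.2f}: optimal overlay opacity = {res.x:.3f}")
```

Sweeping the weight w approximates the Pareto front for this toy problem; richer multi-objective methods become relevant once the objectives are non-convex or more numerous.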

Figure: Readability evaluation of node-link diagrams of trajectories using different link types that encode direction. Here, a repeating comet texture indicates the direction; the texture is adapted to human perception of direction. In addition, node splatting indicates node locations and spatial node density.

Publications

  1. K. Kurzhals, M. Burch, T. Pfeiffer, and D. Weiskopf, “Eye Tracking in Computer-Based Visualization,” Computing in Science and Engineering, vol. 17, no. 5, pp. 64–71, 2015.
  2. R. Netzel, M. Hlawatsch, M. Burch, S. Balakrishnan, H. Schmauder, and D. Weiskopf, “An Evaluation of Visual Search Support in Maps,” IEEE Transactions on Visualization and Computer Graphics, vol. 23, no. 1, pp. 421–430, 2017.
  3. K. Kurzhals, M. Hlawatsch, C. Seeger, and D. Weiskopf, “Visual Analytics for Mobile Eye Tracking,” IEEE Transactions on Visualization and Computer Graphics, vol. 23, no. 1, pp. 301–310, 2017.
  4. R. Netzel, B. Ohlhausen, K. Kurzhals, R. Woods, M. Burch, and D. Weiskopf, “User performance and reading strategies for metro maps: An eye tracking study,” Spatial Cognition and Computation, vol. 17, no. 1–2, pp. 39–64, 2017.
  5. D. Weiskopf, M. Burch, L. L. Chuang, B. Fischer, and A. Schmidt, Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer, 2016.
  6. R. Netzel, M. Burch, and D. Weiskopf, “Interactive Scanpath-Oriented Annotation of Fixations,” 2016, pp. 183–187.
  7. K. Kurzhals, E. Cetinkaya, Y. Hu, W. Wang, and D. Weiskopf, “Close to the action: eyetracking evaluation of speaker-following subtitles,” 2017, pp. 6559–6568.
  8. M. Behrisch et al., “Quality Metrics for Information Visualization,” EuroVis STAR, 2018.
  9. K. Kurzhals, M. Stoll, A. Bruhn, and D. Weiskopf, “FlowBrush: Optical Flow Art,” in Symposium on Computational Aesthetics, Sketch-Based Interfaces and Modeling, and Non-Photorealistic Animation and Rendering (EXPRESSIVE, co-located with SIGGRAPH), 2017.
  10. N. Rodrigues et al., “Visualization of Time Series Data with Spatial Context: Communicating the Energy Production of Power Plants,” in Proceedings of the 10th International Symposium on Visual Information Communication and Interaction (VINCI ’17), New York, NY, USA, 2017, no. 8, pp. 37–44.
  11. N. Rodrigues and D. Weiskopf, “Nonlinear Dot Plots,” in IEEE VIS, 2017.
  12. N. Rodrigues, M. Burch, L. Di Silvestro, and D. Weiskopf, “A Visual Analytics Approach for Word Relevances in Multiple Texts,” 2017.
  13. R. Netzel, J. Vuong, U. Engelke, S. O’Donoghue, D. Weiskopf, and J. Heinrich, “Comparative eye-tracking evaluation of scatterplots and parallel coordinates,” Visual Informatics, vol. 1, no. 2, pp. 118–131, 2017.
  14. M. Burch, M. Hlawatsch, and D. Weiskopf, “Visualizing a Sequence of a Thousand Graphs (or Even More),” Computer Graphics Forum, vol. 36, no. 3, pp. 261–271, 2017.
  15. N. Rodrigues, R. Netzel, J. Spalink, and D. Weiskopf, “Multiscale scanpath visualization and filtering,” in Workshop on Eye Tracking and Visualization, 2018, no. 2.
  16. M. Burch, R. Woods, R. Netzel, and D. Weiskopf, “The Challenges of Designing Metro Maps,” 2016, pp. 195–202.