B01 | Adaptive Self-Consistent Visualization

Prof. Daniel Weiskopf, University of Stuttgart

Prof. Sabine Storandt, University of Konstanz

Nils Rodrigues, University of Stuttgart

Our long-term goal is to adapt the computer-based visualization process so that the visualization outcome is consistent with both the input data and the objectives of the visualization itself. The project thus contributes to the key challenge of obtaining reliable visualizations. Our model of self-consistency is formulated quantitatively: how much does the visualization result deviate from the intended result? The adaptation optimizes parameters of the visualization pipeline to minimize this deviation, building on perceptual models, techniques from computer vision, efficient optimization methods, and support for multi-objective optimization.
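
The deviation-minimization idea can be sketched as a small optimization loop. The following is only an illustration, not the project's actual pipeline: it assumes a single visualization parameter (a gamma-like intensity mapping), a toy perceptual model (a Stevens-style power law for brightness), and minimizes the squared deviation between perceived and intended values.

```python
# Minimal sketch of deviation-driven parameter adaptation.
# The perceptual model, parameter, and data here are illustrative
# assumptions, not the actual models of project B01.
import numpy as np
from scipy.optimize import minimize_scalar

data = np.linspace(0.05, 1.0, 20)          # normalized input data values

def perceived(luminance, exponent=0.5):
    """Toy perceptual model: Stevens-style power law for brightness."""
    return luminance ** exponent

def deviation(gamma):
    """Squared deviation between perceived output and intended data values."""
    luminance = data ** gamma               # visualization parameter: gamma mapping
    return np.sum((perceived(luminance) - data) ** 2)

# Adapt the parameter so that perception matches the data as closely as possible.
result = minimize_scalar(deviation, bounds=(0.1, 10.0), method="bounded")
print(f"optimal gamma = {result.x:.3f}")   # close to 2.0, compensating the 0.5 exponent
```

With the assumed perceptual exponent of 0.5, the optimizer recovers a gamma near 2, i.e., the mapping that exactly cancels the perceptual distortion; a real adaptation would replace both the parameter space and the deviation measure with the models developed in the project.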

Research Questions

How can we build an efficient optimization framework for the adaptation of self-consistent visualization techniques?

What kind of quantitative models from the perceptual psychology and computer vision literature can be used to optimize the encoding of direction information in 2D visualization?

Which quantitative models are useful for optimizing animated visualizations with moving patterns?

How can we adapt the visualization so that relative differences or derivatives of the input data are preserved as contrast in the visualization images?

How can multi-objective optimization be used for overlaid visualization of multivariate data?
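
The last question, multi-objective optimization of overlaid visualizations, can be illustrated with a weighted-sum scalarization, one standard way to trade off conflicting objectives. The two objectives below (overdraw vs. faintness of marks) and their functional forms are hypothetical stand-ins, not the project's actual readability metrics.

```python
# Illustrative sketch of multi-objective adaptation via weighted-sum
# scalarization; objectives and weights are hypothetical stand-ins
# for perceptual readability metrics, not the project's actual models.
import numpy as np
from scipy.optimize import minimize

def overdraw(params):
    """Hypothetical objective 1: penalize overlap (grows with opacity * size)."""
    opacity, size = params
    return (opacity * size) ** 2

def faintness(params):
    """Hypothetical objective 2: penalize low visibility of marks."""
    opacity, size = params
    return (1.0 - opacity) ** 2 + (1.0 - size) ** 2

def scalarized(params, w=0.5):
    """Weighted sum turns the multi-objective problem into a scalar one."""
    return w * overdraw(params) + (1.0 - w) * faintness(params)

# Sweeping the weight w traces out different trade-off (Pareto) solutions.
for w in (0.2, 0.5, 0.8):
    res = minimize(scalarized, x0=[0.5, 0.5], args=(w,),
                   bounds=[(0.0, 1.0), (0.0, 1.0)])
    print(f"w={w}: opacity={res.x[0]:.2f}, size={res.x[1]:.2f}")
```

Sweeping the weight shows the trade-off: a higher weight on the overdraw objective yields smaller, more transparent marks. More sophisticated schemes (e.g., Pareto-front enumeration) follow the same pattern of combining several deviation measures into one optimization target.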

Fig. 1: Readability evaluation of node-link diagrams of trajectories using different link types, which encode the direction. Here, a repeating comet texture is used to indicate the direction. The comet texture is adapted to the human perception of direction. Furthermore, node splatting is used to indicate node locations and spatial node density.

Publications

  1. N. Rodrigues, C. Schulz, A. Lhuillier, and D. Weiskopf, “Cluster-Flow Parallel Coordinates: Tracing Clusters Across Subspaces,” in Proceedings of the Graphics Interface Conference (GI) (forthcoming), 2020, pp. 0:1-0:11, [Online]. Available: https://openreview.net/forum?id=oVHjlwLkl-.
  2. L. Zhou, M. Rivinius, C. R. Johnson, and D. Weiskopf, “Photographic High-Dynamic-Range Scalar Visualization,” IEEE Transactions on Visualization and Computer Graphics, vol. 26, no. 6, 2020, doi: 10.1109/TVCG.2020.2970522.
  3. N. Pathmanathan et al., “Eye vs. Head: Comparing Gaze Methods for Interaction in Augmented Reality,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), Stuttgart, Germany, 2020, pp. 50:1-50:5, doi: 10.1145/3379156.3391829.
  4. K. Kurzhals et al., “Visual Analytics and Annotation of Pervasive Eye Tracking Video,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), Stuttgart, Germany, 2020, pp. 16:1-16:9, doi: 10.1145/3379155.3391326.
  5. S. Öney et al., “Evaluation of Gaze Depth Estimation from Eye Tracking in Augmented Reality,” in Proceedings of the Symposium on Eye Tracking Research & Applications-Short Paper (ETRA-SP), 2020, pp. 49:1-49:5, doi: 10.1145/3379156.3391835.
  6. V. Bruder, K. Kurzhals, S. Frey, D. Weiskopf, and T. Ertl, “Space-Time Volume Visualization of Gaze and Stimulus,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), 2019, pp. 12:1-12:9, doi: 10.1145/3314111.3319812.
  7. R. Netzel, N. Rodrigues, A. Haug, and D. Weiskopf, “Compensation of Simultaneous Orientation Contrast in Superimposed Textures,” in Proceedings of the Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP), 2019, vol. 3: IVAPP, pp. 48–57, doi: 10.5220/0007356800480057.
  8. N. Silva et al., “Eye Tracking Support for Visual Analytics Systems: Foundations, Current Applications, and Research Challenges,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), 2019, pp. 11:1-11:9, doi: 10.1145/3314111.3319919.
  9. L. Zhou, R. Netzel, D. Weiskopf, and C. R. Johnson, “Spectral Visualization Sharpening,” in Proceedings of the ACM Symposium on Applied Perception (SAP), 2019, pp. 18:1-18:9, doi: 10.1145/3343036.3343133.
  10. V. Bruder et al., “Volume-Based Large Dynamic Graph Analysis Supported by Evolution Provenance,” Multimedia Tools and Applications, vol. 78, no. 23, 2019, doi: 10.1007/s11042-019-07878-6.
  11. M. Behrisch et al., “Quality Metrics for Information Visualization,” Computer Graphics Forum, vol. 37, no. 3, 2018, doi: 10.1111/cgf.13446.
  12. N. Rodrigues and D. Weiskopf, “Nonlinear Dot Plots,” IEEE Transactions on Visualization and Computer Graphics, vol. 24, no. 1, 2018, doi: 10.1109/TVCG.2017.2744018.
  13. C. Schulz, A. Zeyfang, M. van Garderen, H. Ben Lahmar, M. Herschel, and D. Weiskopf, “Simultaneous Visual Analysis of Multiple Software Hierarchies,” in Proceedings of the IEEE Working Conference on Software Visualization (VISSOFT), 2018, pp. 87–95, doi: 10.1109/VISSOFT.2018.00017.
  14. N. Rodrigues, R. Netzel, J. Spalink, and D. Weiskopf, “Multiscale Scanpath Visualization and Filtering,” in Proceedings of the Symposium on Eye Tracking and Visualization (ETVIS), 2018, pp. 2:1-2:5, doi: 10.1145/3205929.3205931.
  15. R. Netzel, M. Hlawatsch, M. Burch, S. Balakrishnan, H. Schmauder, and D. Weiskopf, “An Evaluation of Visual Search Support in Maps,” IEEE Transactions on Visualization and Computer Graphics, vol. 23, no. 1, 2017, doi: 10.1109/TVCG.2016.2598898.
  16. K. Kurzhals, M. Hlawatsch, C. Seeger, and D. Weiskopf, “Visual Analytics for Mobile Eye Tracking,” IEEE Transactions on Visualization and Computer Graphics, vol. 23, no. 1, 2017, doi: 10.1109/TVCG.2016.2598695.
  17. K. Kurzhals, E. Çetinkaya, Y. Hu, W. Wang, and D. Weiskopf, “Close to the Action: Eye-tracking Evaluation of Speaker-following Subtitles,” in Proceedings of the CHI Conference on Human Factors in Computing Systems, 2017, pp. 6559–6568, doi: 10.1145/3025453.3025772.
  18. K. Kurzhals, M. Stoll, A. Bruhn, and D. Weiskopf, “FlowBrush: Optical Flow Art,” in Symposium on Computational Aesthetics, Sketch-Based Interfaces and Modeling, and Non-Photorealistic Animation and Rendering (EXPRESSIVE, co-located with SIGGRAPH), 2017, pp. 1:1-1:9, doi: 10.1145/3092912.3092914.
  19. N. Rodrigues et al., “Visualization of Time Series Data with Spatial Context: Communicating the Energy Production of Power Plants,” in Proceedings of the ACM Symposium on Visual Information Communication and Interaction (VINCI), 2017, pp. 37–44, doi: 10.1145/3105971.3105982.
  20. N. Rodrigues, M. Burch, L. Di Silvestro, and D. Weiskopf, “A Visual Analytics Approach for Word Relevances in Multiple Texts,” in Proceedings of the International Conference on Information Visualisation (IV), 2017, pp. 1–7, doi: 10.1109/iV.2017.62.
  21. R. Netzel, J. Vuong, U. Engelke, S. I. O’Donoghue, D. Weiskopf, and J. Heinrich, “Comparative Eye-tracking Evaluation of Scatterplots and Parallel Coordinates,” Visual Informatics, vol. 1, no. 2, 2017, doi: 10.1016/j.visinf.2017.11.001.
  22. M. Burch, M. Hlawatsch, and D. Weiskopf, “Visualizing a Sequence of a Thousand Graphs (or Even More),” Computer Graphics Forum, vol. 36, no. 3, 2017, doi: 10.1111/cgf.13185.
  23. C. Schulz, N. Rodrigues, K. Damarla, A. Henicke, and D. Weiskopf, “Visual Exploration of Mainframe Workloads,” in Proceedings of the SIGGRAPH Asia Symposium on Visualization, 2017, pp. 4:1-4:7, doi: 10.1145/3139295.3139312.
  24. R. Netzel and D. Weiskopf, “Hilbert Attention Maps for Visualizing Spatiotemporal Gaze Data,” in Proceedings of the Symposium on Eye Tracking and Visualization (ETVIS), 2016, pp. 21–25, doi: 10.1109/ETVIS.2016.7851160.
  25. K. Kurzhals, M. Hlawatsch, M. Burch, and D. Weiskopf, “Fixation-Image Charts,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), 2016, pp. 11–18, doi: 10.1145/2857491.2857507.
  26. D. Weiskopf, M. Burch, L. L. Chuang, B. Fischer, and A. Schmidt, Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer, 2016.
  27. T. Blascheck, F. Beck, S. Baltes, T. Ertl, and D. Weiskopf, “Visual Analysis and Coding of Data-rich User Behavior,” in Proceedings of the IEEE Conference on Visual Analytics Science and Technology (VAST), 2016, pp. 141–150, doi: 10.1109/VAST.2016.7883520.
  28. K. Kurzhals, M. Hlawatsch, F. Heimerl, M. Burch, T. Ertl, and D. Weiskopf, “Gaze Stripes: Image-based Visualization of Eye Tracking Data,” IEEE Transactions on Visualization and Computer Graphics, vol. 22, no. 1, 2016, doi: 10.1109/TVCG.2015.2468091.
  29. R. Netzel, M. Burch, and D. Weiskopf, “Interactive Scanpath-oriented Annotation of Fixations,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), 2016, pp. 183–187, doi: 10.1145/2857491.2857498.
  30. A. Kumar, R. Netzel, M. Burch, D. Weiskopf, and K. Mueller, “Multi-Similarity Matrices of Eye Movement Data,” in Proceedings of the Symposium on Eye Tracking and Visualization (ETVIS), 2016, pp. 26–30, doi: 10.1109/ETVIS.2016.7851161.
  31. R. Netzel, M. Burch, and D. Weiskopf, “User Performance and Reading Strategies for Metro Maps: An Eye Tracking Study,” Spatial Cognition and Computation, Special Issue: Eye Tracking for Spatial Research, 2016, doi: 10.1080/13875868.2016.1226839.
  32. K. Kurzhals, B. Fisher, M. Burch, and D. Weiskopf, “Eye Tracking Evaluation of Visual Analytics,” Information Visualization, vol. 15, no. 4, 2016, doi: 10.1177/1473871615609787.
  33. M. Burch, R. Woods, R. Netzel, and D. Weiskopf, “The Challenges of Designing Metro Maps,” in Proceedings of the Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP), 2016, vol. 2: IVAPP, doi: 10.5220/0005679601950202.
  34. K. Kurzhals, M. Burch, T. Pfeiffer, and D. Weiskopf, “Eye Tracking in Computer-based Visualization,” Computing in Science & Engineering, vol. 17, no. 5, 2015, doi: 10.1109/MCSE.2015.93.