B01 | Adaptive Self-Consistent Visualization

Prof. Daniel Weiskopf, University of Stuttgart

Prof. Daniel Keim, University of Konstanz

Tim Krake, University of Stuttgart

Our long-term goal is to adapt the computer-based visualization process so that the visualization outcome is consistent with the input data and with the objectives of the visualization itself. The project thus contributes to addressing the key challenge of obtaining reliable visualizations. Our model of self-consistency is formulated quantitatively: how much does the visualization result deviate from the intended result? The adaptation optimizes parameters of the visualization pipeline to minimize this deviation, drawing on perceptual models, techniques from computer vision, efficient optimization methods, and support for multi-objective optimization.
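To make this quantitative framing concrete, the following Python sketch (purely illustrative, not the project's actual pipeline) treats adaptation as minimizing a placeholder deviation measure between a rendered visualization and the intended result over the pipeline parameters; render(), deviation(), and the parameter vector are hypothetical stand-ins.

    # Illustrative sketch only: a generic deviation-minimization loop for
    # adapting visualization parameters; render() and deviation() are
    # hypothetical placeholders, not the project's actual pipeline.
    import numpy as np
    from scipy.optimize import minimize

    def render(params, data):
        # Stand-in for a visualization pipeline that maps parameters
        # (e.g., texture frequency, contrast, opacity) and data to an image.
        return np.tanh(params[0] * data + params[1])

    def deviation(params, data, intended):
        # Quantitative self-consistency: distance between the rendered result
        # and the intended result (placeholder for a perceptual metric).
        return np.mean((render(params, data) - intended) ** 2)

    data = np.linspace(0.0, 1.0, 256)   # toy input data
    intended = data                     # intended result: a faithful mapping
    result = minimize(deviation, x0=np.array([1.0, 0.0]),
                      args=(data, intended), method="Nelder-Mead")
    print("adapted parameters:", result.x, "residual deviation:", result.fun)

In the project itself, the deviation term would build on perceptual models and computer-vision measures, and the optimized parameters would control actual rendering choices rather than the toy mapping used here.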

Research Questions

How can we build an efficient optimization framework for the adaptation of self-consistent visualization techniques?

What kind of quantitative models from the perceptual psychology and computer vision literature can be used to optimize the encoding of direction information in 2D visualization?

Which quantitative models are useful for optimizing animated visualizations with moving patterns?

How can we adapt the visualization so that the structure of relative differences or derivatives of the input data is preserved as contrast in the visualization images?

How can multi-objective optimization be used for overlaid visualization of multivariate data?
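One standard, purely illustrative way to handle such competing goals is scalarization: several deviation terms (e.g., one per overlaid variable) are combined into a single weighted objective, and different weightings trace out different compromises. The sketch below uses two hypothetical deviation functions and does not represent the project's method.

    # Illustrative weighted-sum scalarization of two competing deviation terms;
    # dev_a and dev_b are hypothetical placeholders (e.g., one per overlaid variable).
    import numpy as np
    from scipy.optimize import minimize

    def dev_a(params):
        return (params[0] - 1.0) ** 2        # placeholder objective A

    def dev_b(params):
        return (params[0] + params[1]) ** 2  # placeholder objective B

    def scalarized(params, w_a=0.5, w_b=0.5):
        # Weighted sum: each choice of positive weights yields one Pareto-optimal compromise.
        return w_a * dev_a(params) + w_b * dev_b(params)

    result = minimize(scalarized, x0=np.zeros(2), method="Nelder-Mead")
    print("compromise parameters:", result.x)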

Fig. 1: Readability evaluation of node-link diagrams of trajectories using different link types that encode direction. Here, a repeating comet texture indicates the direction; the texture is adapted to the human perception of direction. In addition, node splatting indicates node locations and spatial node density.

Publications

  1. D. Klötzl, T. Krake, M. Becher, M. Koch, D. Weiskopf, and K. Kurzhals, “NMF-Based Analysis of Mobile Eye-Tracking Data,” in Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 2024, pp. 1–9. doi: 10.1145/3649902.3653518.
  2. N. Rodrigues, C. Schulz, S. Döring, D. Baumgartner, T. Krake, and D. Weiskopf, “Relaxed Dot Plots: Faithful Visualization of Samples and Their Distribution,” IEEE Transactions on Visualization and Computer Graphics, vol. 29, no. 1, Art. no. 1, Jan. 2023, doi: 10.1109/TVCG.2022.3209429.
  3. K.-T. Chen et al., “Gazealytics: A Unified and Flexible Visual Toolkit for Exploratory and Comparative Gaze Analysis,” in ETRA ’23: Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, Association for Computing Machinery, May 2023, pp. 1–7. doi: 10.1145/3588015.3589844.
  4. K.-T. Chen et al., “Reading Strategies for Graph Visualizations That Wrap Around in Torus Topology,” in Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, New York, NY, USA: Association for Computing Machinery, 2023. doi: 10.1145/3588015.3589841.
  5. M. Koch, K. Kurzhals, M. Burch, and D. Weiskopf, “Visualization Psychology for Eye Tracking Evaluation,” in Visualization Psychology, D. Albers Szafir, R. Borgo, M. Chen, D. J. Edwards, B. Fisher, and L. Padilla, Eds., Cham: Springer International Publishing, 2023, pp. 243–260. doi: 10.1007/978-3-031-34738-2_10.
  6. F. Schreiber and D. Weiskopf, “Quantitative Visual Computing,” it - Information Technology, vol. 64, pp. 119–120, 2022, doi: 10.1515/itit-2022-0048.
  7. M. Koch, D. Weiskopf, and K. Kurzhals, “A Spiral into the Mind: Gaze Spiral Visualization for Mobile Eye Tracking,” Proceedings of the ACM on Computer Graphics and Interactive Techniques, vol. 5, no. 2, Art. no. 2, May 2022, doi: 10.1145/3530795.
  8. N. Rodrigues, L. Shao, J. J. Yan, T. Schreck, and D. Weiskopf, “Eye Gaze on Scatterplot: Concept and First Results of Recommendations for Exploration of SPLOMs Using Implicit Data Selection,” in 2022 Symposium on Eye Tracking Research and Applications, New York, NY, USA: Association for Computing Machinery, 2022, pp. 59:1-59:7. doi: 10.1145/3517031.3531165.
  9. K. Angerbauer et al., “Accessibility for Color Vision Deficiencies: Challenges and Findings of a Large Scale Study on Paper Figures,” in Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New York, NY, USA: Association for Computing Machinery, 2022. doi: 10.1145/3491102.3502133.
  10. P. Schäfer, N. Rodrigues, D. Weiskopf, and S. Storandt, “Group Diagrams for Simplified Representation of Scanpaths,” in Proceedings of the ACM Symposium on Visual Information Communication and Interaction (VINCI), ACM, Aug. 2022. doi: 10.1145/3554944.3554971.
  11. T. Krake, M. von Scheven, J. Gade, M. Abdelaal, D. Weiskopf, and M. Bischoff, “Efficient Update of Redundancy Matrices for Truss and Frame Structures,” Journal of Theoretical, Computational and Applied Mechanics, 2022, [Online]. Available: https://jtcam.episciences.org/10398
  12. T. Krake, A. Bruhn, B. Eberhardt, and D. Weiskopf, “Efficient and Robust Background Modeling with Dynamic Mode Decomposition,” Journal of Mathematical Imaging and Vision, 2022, doi: 10.1007/s10851-022-01068-0.
  13. G. Richer, A. Pister, M. Abdelaal, J.-D. Fekete, M. Sedlmair, and D. Weiskopf, “Scalability in Visualization,” IEEE Transactions on Visualization and Computer Graphics, pp. 1–15, 2022.
  14. T. Krake, D. Klötzl, B. Eberhardt, and D. Weiskopf, “Constrained Dynamic Mode Decomposition,” IEEE Transactions on Visualization and Computer Graphics, pp. 1–11, 2022, doi: 10.1109/tvcg.2022.3209437.
  15. F. Chiossi et al., “Adapting visualizations and interfaces to the user,” it - Information Technology, vol. 64, pp. 133–143, 2022, doi: 10.1515/itit-2022-0035.
  16. R. Bian et al., “Implicit Multidimensional Projection of Local Subspaces,” IEEE Transactions on Visualization and Computer Graphics, vol. 27, no. 2, Art. no. 2, 2021, doi: 10.1109/TVCG.2020.3030368.
  17. L. Zhou, C. R. Johnson, and D. Weiskopf, “Data-Driven Space-Filling Curves,” IEEE Transactions on Visualization and Computer Graphics, vol. 27, no. 2, Art. no. 2, 2021, doi: 10.1109/TVCG.2020.3030473.
  18. L. Zhou, M. Rivinius, C. R. Johnson, and D. Weiskopf, “Photographic High-Dynamic-Range Scalar Visualization,” IEEE Transactions on Visualization and Computer Graphics, vol. 26, no. 6, Art. no. 6, 2020, doi: 10.1109/TVCG.2020.2970522.
  19. K. Kurzhals et al., “Visual Analytics and Annotation of Pervasive Eye Tracking Video,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), ACM, 2020, pp. 16:1-16:9. doi: 10.1145/3379155.3391326.
  20. L. Merino, M. Schwarzl, M. Kraus, M. Sedlmair, D. Schmalstieg, and D. Weiskopf, “Evaluating Mixed and Augmented Reality: A Systematic Literature Review (2009–2019),” in IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2020. [Online]. Available: https://ieeexplore.ieee.org/abstract/document/9284762
  21. D. Weiskopf, “Vis4Vis: Visualization for (Empirical) Visualization Research,” in Foundations of Data Visualization, M. Chen, H. Hauser, P. Rheingans, and G. Scheuermann, Eds., Springer International Publishing, 2020, pp. 209–224. doi: 10.1007/978-3-030-34444-3_10.
  22. N. Rodrigues, C. Schulz, A. Lhuillier, and D. Weiskopf, “Cluster-Flow Parallel Coordinates: Tracing Clusters Across Subspaces,” in Proceedings of the Graphics Interface Conference (GI), Canadian Human-Computer Communications Society / Société canadienne du dialogue humain-machine, 2020, pp. 0:1-0:11. doi: 10.20380/GI2020.38.
  23. A. Kumar, P. Howlader, R. Garcia, D. Weiskopf, and K. Mueller, “Challenges in Interpretability of Neural Networks for Eye Movement Data,” in ACM Symposium on Eye Tracking Research and Applications, New York, NY, USA: Association for Computing Machinery, 2020. doi: 10.1145/3379156.3391361.
  24. N. Pathmanathan et al., “Eye vs. Head: Comparing Gaze Methods for Interaction in Augmented Reality,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), ACM, 2020, pp. 50:1-50:5. doi: 10.1145/3379156.3391829.
  25. S. Öney et al., “Evaluation of Gaze Depth Estimation from Eye Tracking in Augmented Reality,” in Proceedings of the Symposium on Eye Tracking Research & Applications - Short Paper (ETRA-SP), ACM, 2020, pp. 49:1-49:5. doi: 10.1145/3379156.3391835.
  26. A. Kumar, D. Mohanty, K. Kurzhals, F. Beck, D. Weiskopf, and K. Mueller, “Demo of the EyeSAC System for Visual Synchronization, Cleaning, and Annotation of Eye Movement Data,” in ACM Symposium on Eye Tracking Research and Applications, New York, NY, USA: Association for Computing Machinery, 2020. doi: 10.1145/3379157.3391988.
  27. K. Kurzhals, M. Burch, and D. Weiskopf, “What We See and What We Get from Visualization: Eye Tracking Beyond Gaze Distributions and Scanpaths,” CoRR, 2020, [Online]. Available: https://arxiv.org/abs/2009.14515
  28. R. Garcia and D. Weiskopf, “Inner-Process Visualization of Hidden States in Recurrent Neural Networks,” in Proceedings of the 13th International Symposium on Visual Information Communication and Interaction, New York, NY, USA: Association for Computing Machinery, 2020. doi: 10.1145/3430036.3430047.
  29. R. Netzel, N. Rodrigues, A. Haug, and D. Weiskopf, “Compensation of Simultaneous Orientation Contrast in Superimposed Textures,” in Proceedings of the Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP), A. Kerren, C. Hurter, and J. Braz, Eds., SciTePress, 2019, pp. 48–57. [Online]. Available: http://www.scitepress.org/DigitalLibrary/Link.aspx?doi=10.5220/0007356800480057
  30. V. Bruder et al., “Volume-Based Large Dynamic Graph Analysis Supported by Evolution Provenance,” Multimedia Tools and Applications, vol. 78, no. 23, Art. no. 23, 2019, doi: 10.1007/s11042-019-07878-6.
  31. V. Bruder, K. Kurzhals, S. Frey, D. Weiskopf, and T. Ertl, “Space-Time Volume Visualization of Gaze and Stimulus,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), K. Krejtz and B. Sharif, Eds., ACM, 2019, pp. 12:1-12:9. doi: 10.1145/3314111.3319812.
  32. L. Zhou, R. Netzel, D. Weiskopf, and C. R. Johnson, “Spectral Visualization Sharpening,” in Proceedings of the ACM Symposium on Applied Perception (SAP), S. Neyret, E. Kokkinara, M. González-Franco, L. Hoyet, D. W. Cunningham, and J. Swidrak, Eds., ACM, 2019, pp. 18:1-18:9. doi: 10.1145/3343036.3343133.
  33. N. Silva et al., “Eye Tracking Support for Visual Analytics Systems: Foundations, Current Applications, and Research Challenges,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), K. Krejtz and B. Sharif, Eds., ACM, 2019, pp. 11:1-11:9. doi: 10.1145/3314111.3319919.
  34. N. Rodrigues, R. Netzel, J. Spalink, and D. Weiskopf, “Multiscale Scanpath Visualization and Filtering,” in Proceedings of the Symposium on Eye Tracking and Visualization (ETVIS), L. L. Chuang, M. Burch, and K. Kurzhals, Eds., ACM, 2018, pp. 2:1-2:5. doi: 10.1145/3205929.3205931.
  35. N. Rodrigues and D. Weiskopf, “Nonlinear Dot Plots,” IEEE Transactions on Visualization and Computer Graphics, vol. 24, no. 1, Art. no. 1, 2018, doi: 10.1109/TVCG.2017.2744018.
  36. C. Schulz, A. Zeyfang, M. van Garderen, H. Ben Lahmar, M. Herschel, and D. Weiskopf, “Simultaneous Visual Analysis of Multiple Software Hierarchies,” in Proceedings of the IEEE Working Conference on Software Visualization (VISSOFT), IEEE, 2018, pp. 87–95. [Online]. Available: https://ieeexplore.ieee.org/document/8530134/
  37. M. Behrisch et al., “Quality Metrics for Information Visualization,” Computer Graphics Forum, vol. 37, no. 3, Art. no. 3, 2018, doi: 10.1111/cgf.13446.
  38. N. Rodrigues, M. Burch, L. Di Silvestro, and D. Weiskopf, “A Visual Analytics Approach for Word Relevances in Multiple Texts,” in Proceedings of the International Conference on Information Visualisation (IV), IEEE, 2017, pp. 1–7. [Online]. Available: https://ieeexplore.ieee.org/document/8107940
  39. K. Kurzhals, M. Stoll, A. Bruhn, and D. Weiskopf, “FlowBrush: Optical Flow Art,” in Symposium on Computational Aesthetics, Sketch-Based Interfaces and Modeling, and Non-Photorealistic Animation and Rendering (EXPRESSIVE, co-located with SIGGRAPH), 2017, pp. 1:1-1:9. doi: 10.1145/3092912.3092914.
  40. K. Kurzhals, E. Çetinkaya, Y. Hu, W. Wang, and D. Weiskopf, “Close to the Action: Eye-Tracking Evaluation of Speaker-Following Subtitles,” in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, ACM, 2017, pp. 6559–6568. doi: 10.1145/3025453.3025772.
  41. C. Schulz, N. Rodrigues, K. Damarla, A. Henicke, and D. Weiskopf, “Visual Exploration of Mainframe Workloads,” in Proceedings of the SIGGRAPH Asia Symposium on Visualization, ACM, 2017, pp. 4:1-4:7. doi: 10.1145/3139295.3139312.
  42. R. Netzel, J. Vuong, U. Engelke, S. I. O’Donoghue, D. Weiskopf, and J. Heinrich, “Comparative Eye-tracking Evaluation of Scatterplots and Parallel Coordinates,” Visual Informatics, vol. 1, no. 2, Art. no. 2, 2017, doi: 10.1016/j.visinf.2017.11.001.
  43. N. Rodrigues et al., “Visualization of Time Series Data with Spatial Context: Communicating the Energy Production of Power Plants,” in Proceedings of the ACM Symposium on Visual Information Communication and Interaction (VINCI), 2017, pp. 37–44. doi: 10.1145/3105971.3105982.
  44. R. Netzel and D. Weiskopf, “Hilbert Attention Maps for Visualizing Spatiotemporal Gaze Data,” in Proceedings of the Symposium on Eye Tracking and Visualization (ETVIS), 2016, pp. 21–25. [Online]. Available: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7851160
  45. K. Kurzhals, M. Hlawatsch, M. Burch, and D. Weiskopf, “Fixation-Image Charts,” in Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), vol. 1, ACM, 2016, pp. 11–18. doi: 10.1145/2857491.2857507.
  46. T. Blascheck, F. Beck, S. Baltes, T. Ertl, and D. Weiskopf, “Visual Analysis and Coding of Data-rich User Behavior,” in Proceedings of the IEEE Conference on Visual Analytics Science and Technology (VAST), G. L. Andrienko, S. Liu, and J. T. Stasko, Eds., IEEE, 2016, pp. 141–150. [Online]. Available: https://ieeexplore.ieee.org/document/7883520
  47. K. Kurzhals, B. Fisher, M. Burch, and D. Weiskopf, “Eye Tracking Evaluation of Visual Analytics,” Information Visualization, vol. 15, no. 4, Art. no. 4, 2016, doi: 10.1177/1473871615609787.
  48. R. Netzel, M. Burch, and D. Weiskopf, “User Performance and Reading Strategies for Metro Maps: An Eye Tracking Study,” Spatial Cognition & Computation: An Interdisciplinary Journal (Special Issue on Eye Tracking for Spatial Research), 2016, doi: 10.1080/13875868.2016.1226839.
  49. R. Netzel, M. Burch, and D. Weiskopf, “Interactive Scanpath-Oriented Annotation of Fixations,” Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, pp. 183–187, 2016, doi: 10.1145/2857491.2857498.
  50. M. Burch, R. Woods, R. Netzel, and D. Weiskopf, “The Challenges of Designing Metro Maps,” Proceedings of the 11th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, 2016, doi: 10.5220/0005679601950202.
  51. K. Kurzhals, M. Hlawatsch, F. Heimerl, M. Burch, T. Ertl, and D. Weiskopf, “Gaze Stripes: Image-Based Visualization of Eye Tracking Data,” IEEE Transactions on Visualization and Computer Graphics, vol. 22, no. 1, Art. no. 1, 2016, doi: 10.1109/TVCG.2015.2468091.
  52. D. Weiskopf, M. Burch, L. L. Chuang, B. Fischer, and A. Schmidt, Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer, 2016. [Online]. Available: https://www.springer.com/de/book/9783319470238
  53. A. Kumar, R. Netzel, M. Burch, D. Weiskopf, and K. Mueller, “Multi-Similarity Matrices of Eye Movement Data,” in Proceedings of the Symposium on Eye Tracking and Visualization (ETVIS), 2016, pp. 26–30. [Online]. Available: https://ieeexplore.ieee.org/document/7851161
  54. K. Kurzhals, M. Burch, T. Pfeiffer, and D. Weiskopf, “Eye Tracking in Computer-based Visualization,” Computing in Science & Engineering, vol. 17, no. 5, Art. no. 5, 2015, doi: 10.1109/MCSE.2015.93.
