C02 | Physiologically Based Interaction and Adaptive Visualization

Prof. Albrecht Schmidt, Universität Stuttgart

Prof. Harald Reiterer, Universität Konstanz 

Romina Kettner

In this project, we research new methods and techniques for cognition-aware visualizations. The basic idea is that a cognition-aware adaptive visualization observes the physiological responses of a person while they interact with a system and uses them as implicit input. Electrical signals measured on the body (e.g., EEG, EMG, ECG, and galvanic skin response), changes in physiological parameters (e.g., body temperature, respiration rate, and pulse), and the user's gaze behavior are used to estimate cognitive load and understanding. Through experimental research, concepts and models for adaptive and dynamic visualizations will be created. Frameworks and tools will be realized and empirically validated.
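To make the idea concrete, here is a minimal sketch of how time-aligned sensor readings could be fused into a single load estimate. The feature set, baseline ranges, and equal weighting are placeholder assumptions for illustration, not the project's actual model:

```python
# Illustrative sketch only: feature names, normalization ranges, and
# weights below are hypothetical, not the project's validated model.
from dataclasses import dataclass

@dataclass
class PhysiologicalSample:
    """One time-aligned reading from the sensor streams named above."""
    heart_rate_bpm: float        # e.g., derived from ECG / pulse
    gsr_microsiemens: float      # galvanic skin response
    respiration_rate_bpm: float  # breaths per minute
    pupil_diameter_mm: float     # from an eye tracker

def normalize(value: float, low: float, high: float) -> float:
    """Clamp and scale a raw reading into [0, 1] against an assumed baseline range."""
    return min(max((value - low) / (high - low), 0.0), 1.0)

def estimate_cognitive_load(s: PhysiologicalSample) -> float:
    """Average the normalized features into a load score in [0, 1].

    Equal weighting is a placeholder; in practice the mapping would be
    calibrated per user from experimental data.
    """
    features = [
        normalize(s.heart_rate_bpm, 60, 120),
        normalize(s.gsr_microsiemens, 1, 20),
        normalize(s.respiration_rate_bpm, 12, 30),
        normalize(s.pupil_diameter_mm, 2, 8),
    ]
    return sum(features) / len(features)

sample = PhysiologicalSample(88, 9.5, 18, 4.6)
print(f"estimated cognitive load: {estimate_cognitive_load(sample):.2f}")
```

In practice, the mapping from raw signals to cognitive load would be learned per user from calibration data rather than fixed by hand, which is precisely what the experimental part of the project aims to establish.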

Research Questions

How can we optimize interactive visualization to create the best possible user experience and to maximize task performance?

To address this overarching question, the following sub-questions need to be answered:

What are relevant quantitative parameters for the user experience in visualization tasks?

How can we estimate these parameters based on physiological measurements?

What forms of adaptation can influence the experienced cognitive load?

How can these mechanisms be designed and implemented to change and adapt interactive visualizations?
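As a minimal illustration of the last two questions, the sketch below maps a load estimate (such as the one produced above) onto a level of detail for the visualization. The thresholds and detail levels are hypothetical placeholders, not validated adaptation mechanisms:

```python
# Hypothetical adaptation rule: map an estimated cognitive load in [0, 1]
# onto a level of detail for the visualization. The thresholds and the
# three detail levels are illustrative placeholders only.
def choose_level_of_detail(load: float) -> str:
    """Reduce visual complexity as the estimated cognitive load rises."""
    if load < 0.3:
        return "full detail"          # spare capacity: show everything
    if load < 0.7:
        return "reduced annotations"  # moderate load: declutter
    return "simplified overview"      # high load: show less at once

# Example: how the rule responds across the load range.
for load in (0.2, 0.5, 0.9):
    print(f"load {load:.1f} -> {choose_level_of_detail(load)}")
```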

Figure: Possible input modalities for physiological data: EEG, eye tracking, EMG, and thermal imaging.

Figure: Abstract representation of a cognition-aware adaptive visualization.

Publications

  1. J. Karolus, P. W. Woźniak, and L. L. Chuang, “Towards Using Gaze Properties to Detect Language Proficiency,” in Proceedings of the 9th Nordic Conference on Human-Computer Interaction (NordiCHI ’16), New York, NY, USA, 2016, no. 118, p. 6.
  2. L. Lischke, P. Knierim, and H. Klinke, “Mid-Air Gestures for Window Management on Large Displays,” in Mensch und Computer 2015 - Tagungsband, Berlin, München, Boston, 2015, pp. 439–442.
  3. D. Weiskopf, M. Burch, L. L. Chuang, B. Fischer, and A. Schmidt, Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer, 2016.
  4. L. Lischke, J. Grüninger, K. Klouche, A. Schmidt, P. Slusallek, and G. Jacucci, “Interaction Techniques for Wall-Sized Screens,” in Proceedings of the 2015 International Conference on Interactive Tabletops & Surfaces - ITS ’15, 2015, pp. 501–504.
  5. L. Lischke, V. Schwind, K. Friedrich, A. Schmidt, and N. Henze, “MAGIC-Pointing on Large High-Resolution Displays,” in Proceedings of the 34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ’16), 2016, pp. 1706–1712.
  6. J. Karolus, P. W. Woźniak, L. L. Chuang, and A. Schmidt, “Robust Gaze Features for Enabling Language Proficiency Awareness,” in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17), New York, NY, USA, 2017, pp. 2998–3010.
  7. N. Flad, J. Ditz, H. H. Bülthoff, and L. L. Chuang, “Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering,” in Second Workshop on Eye Tracking and Visualization, IEEE Visualization 2016, 2016.

...