In this project, we investigate new methods and techniques for cognition-aware visualizations. The basic idea is that a cognition-aware adaptive visualization observes the physiological response of a person interacting with a system and uses it as implicit input. Electrical signals measured on the body (e.g. EEG, EMG, ECG, galvanic skin response), changes in physiological parameters (e.g. body temperature, respiration rate, and pulse), as well as the user's gaze behavior are used to estimate cognitive load and understanding. Through experimental research, concepts and models for adaptive and dynamic visualizations will be created. Frameworks and tools will be realized and empirically validated.
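The following sketch illustrates the kind of adaptation loop described above: physiological features are mapped to an estimated cognitive load, which in turn drives rendering parameters of the visualization. It is a minimal, hypothetical example; the feature names, weights, and thresholds are illustrative assumptions and do not reflect the project's actual models or implementation.

from dataclasses import dataclass

@dataclass
class PhysiologicalSample:
    """One measurement window of implicit input; all features are hypothetical placeholders."""
    eeg_theta_alpha_ratio: float   # EEG band-power ratio, often associated with workload
    skin_conductance: float        # galvanic skin response (microsiemens)
    pupil_diameter: float          # from gaze tracking (millimetres)
    respiration_rate: float        # breaths per minute

def estimate_cognitive_load(s: PhysiologicalSample) -> float:
    """Toy estimator: weighted sum of roughly normalized features, clipped to [0, 1].
    A real estimator would be trained and calibrated per user; weights here are illustrative only."""
    load = (0.4 * s.eeg_theta_alpha_ratio
            + 0.3 * s.skin_conductance / 10.0
            + 0.2 * (s.pupil_diameter - 3.0) / 2.0
            + 0.1 * (s.respiration_rate - 12.0) / 10.0)
    return max(0.0, min(1.0, load))

def adapt_visualization(load: float) -> dict:
    """Map the estimated load to hypothetical rendering parameters of the visualization."""
    if load > 0.7:   # high load: simplify the view and guide attention
        return {"detail_level": "low", "highlight_focus": True}
    if load < 0.3:   # low load: show more information
        return {"detail_level": "high", "highlight_focus": False}
    return {"detail_level": "medium", "highlight_focus": False}

# Example adaptation step for one measurement window
sample = PhysiologicalSample(1.2, 6.5, 4.1, 16.0)
print(adapt_visualization(estimate_cognitive_load(sample)))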
How can we optimize interactive visualizations to create the best possible user experience and to maximize task performance?
To answer this overarching question, the following sub-questions need to be addressed:
What are relevant quantitative parameters for the user experience in visualization tasks?
How can we estimate these parameters based on physiological measurements?
Which forms of adaptation can influence the experienced cognitive load?
How can these mechanisms be designed and implemented to change and adapt interactive visualizations?
Models and Measures: Completed
Adaptive Algorithms: Completed
Interaction: Completed
Applications: Completed