C05 | Human-Machine Interaction with Adaptive Multisensory Systems

Prof. Dr. Marc Ernst, Ulm University


Prof. Dr. Albrecht Schmidt, LMU Munich


Priscilla Balestrucci, Ulm University

User-adaptive systems are a recent trend in technological development. Designed to learn the characteristics of the user interacting with them, they adjust their own behavior to provide a targeted, personalized experience.

This project investigates human adaptive behavior in mutual-learning situations. A better understanding of adaptive human-machine interactions, and of human sensorimotor learning processes in particular, will provide guidelines, evaluation criteria, and recommendations that will be beneficial for all projects within SFB/Transregio 161 that focus on the design of user-adaptive systems and algorithms.

To achieve this goal, we will carry out behavioral experiments using human participants and base our empirical choices on the framework of optimal decision theory as derived from the Bayesian approach. This approach can be used as a tool to construct ideal observer models against which human performance can be compared.
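As an illustration of the ideal-observer framework, the following sketch implements reliability-weighted cue combination, a standard Bayesian model of multisensory integration. All numbers are hypothetical and serve only to show how an ideal-observer prediction against which human performance could be compared is computed:

```python
import math

def combine_cues(mu_a, sigma_a, mu_b, sigma_b):
    """Bayesian ideal observer: fuse two noisy sensory estimates.

    Each cue is modeled as a Gaussian estimate of the same quantity;
    the optimal (maximum-likelihood) combined estimate weights each
    cue by its reliability, i.e. the inverse of its variance.
    """
    reliability_a = 1 / sigma_a**2
    reliability_b = 1 / sigma_b**2
    w_a = reliability_a / (reliability_a + reliability_b)
    mu = w_a * mu_a + (1 - w_a) * mu_b
    # The fused estimate is more precise than either cue alone.
    sigma = math.sqrt(1 / (reliability_a + reliability_b))
    return mu, sigma

# Hypothetical example: a precise visual cue and a noisier haptic cue
# give slightly different estimates of an object property.
mu, sigma = combine_cues(mu_a=5.0, sigma_a=0.5, mu_b=6.0, sigma_b=1.0)
```

Here the combined estimate is pulled toward the more reliable cue, and its standard deviation falls below that of either single cue; systematic deviations of human behavior from such predictions are what the experiments can quantify.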

Previous studies have shown that adaptive features can produce undesirable side effects. In applications that require users to improve their skills, for example, such side effects can impede the human learning process (Fig. 1).

Research Questions

What are the determinants of mutual adaptation between an adaptable user and a user-adaptive system?

How does mutual adaptation change based on the sensory modalities involved?

Can mutual adaptation enhance immersion in an interactive virtual environment?

Do the interaction capabilities and experiences learnt in an adaptive system generalize to real-life, non-adaptive scenarios?

Fig. 1: Median pointing errors. In the non-adaptive condition (left panel), the error-based algorithm is not active, so the pointing error equals the displayed feedback error. In the adaptive condition (right panel), the system measures the subject's pointing error (dashed line) and partially corrects for it in the displayed feedback error (solid line). Interestingly, the pointing error increases over time in the adaptive condition.
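The feedback loop described in the caption can be sketched as a toy simulation. The learning rule, parameter values, and constant-bias assumption are all hypothetical, chosen only to illustrate how partially corrected feedback can slow error-based learning, not to reproduce the study's model:

```python
def simulate_pointing(trials=50, bias=2.0, human_rate=0.3,
                      machine_correction=0.5):
    """Toy mutual-adaptation loop (illustrative, not the study's model).

    The subject points with an initial constant bias. On each trial the
    system measures the pointing error and, in the adaptive condition,
    displays only a fraction of it as feedback. The subject corrects
    their aim based on the *displayed* error, so masking part of the
    error slows error-based human learning.
    """
    aim_correction = 0.0
    pointing_errors, displayed_errors = [], []
    for _ in range(trials):
        pointing_error = bias - aim_correction               # true error
        displayed = (1 - machine_correction) * pointing_error  # feedback
        aim_correction += human_rate * displayed             # learning
        pointing_errors.append(pointing_error)
        displayed_errors.append(displayed)
    return pointing_errors, displayed_errors

# Non-adaptive condition: the feedback shows the full error.
true_err, _ = simulate_pointing(machine_correction=0.0)
# Adaptive condition: half the error is hidden from the subject.
adap_err, shown = simulate_pointing(machine_correction=0.5)
```

Under these assumptions the residual pointing error after a fixed number of trials is larger in the adaptive condition, illustrating one mechanism by which an adaptive system could interfere with skill acquisition.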
