C03 | Immersive Virtual Environments

Dr. Lewis L. Chuang, LMU

Prof. Heinrich H. Bülthoff, Max Planck Institute for Biological Cybernetics


Prof. Albrecht Schmidt, LMU


Nina Flad, Max Planck Institute for Biological Cybernetics

Menja Scheer, Max Planck Institute for Biological Cybernetics

Alessandro Nesti, Max Planck Institute for Biological Cybernetics

Christiane Glatz, Max Planck Institute for Biological Cybernetics


Immersive virtual environments (IVEs) simulate real-world scenarios for purposes ranging from entertainment to safety-critical training (e.g., surgery, driving, or flying). This project investigates how computationally demanding IVEs can be optimized to ensure realistic user performance by focusing on how users seek out and process information during real-time operations. For this purpose, mobile gaze-tracking and EEG methods will be extended for use in IVEs. The research context is aircraft simulation, because it encompasses the rendering demands of a realistic world scene, intuitive visualization of in-vehicle instrumentation, and accurate synchronization of non-visual modalities (i.e., real motion).

Research Questions

How can users in IVEs be unobtrusively monitored without interrupting their activity?

How should visual information-seeking and information-processing behavior be interpreted for the evaluation of visual computing?

How should visualizations be effectively combined with non-visual cues in a moving base simulator?

How are visualizations in IVE simulators relied on to support complex closed-loop control behavior?
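Interpreting how users seek out visual information (as in the questions above) typically begins by segmenting raw gaze samples into fixations and saccades. As a minimal illustrative sketch, not the project's actual pipeline, the dispersion-threshold algorithm (I-DT) can be expressed as follows; the function name and the threshold values are placeholder assumptions chosen for the demo:

```python
import numpy as np

def detect_fixations(t, x, y, dispersion_thresh=1.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection.

    t: sample timestamps in seconds (NumPy array); x, y: gaze position in
    degrees of visual angle (NumPy arrays). Returns a list of
    (start_time, end_time, centroid_x, centroid_y) tuples.
    """
    def dispersion(s, e):
        # Dispersion = (max x - min x) + (max y - min y) over samples s..e.
        return (x[s:e + 1].max() - x[s:e + 1].min()
                + y[s:e + 1].max() - y[s:e + 1].min())

    fixations = []
    start, n = 0, len(t)
    while start < n:
        # Grow a window until it spans at least min_duration.
        end = start
        while end < n and t[end] - t[start] < min_duration:
            end += 1
        if end >= n:
            break
        if dispersion(start, end) <= dispersion_thresh:
            # Extend the window while the dispersion stays below threshold.
            while end + 1 < n and dispersion(start, end + 1) <= dispersion_thresh:
                end += 1
            fixations.append((t[start], t[end],
                              x[start:end + 1].mean(), y[start:end + 1].mean()))
            start = end + 1
        else:
            start += 1  # slide the window past a moving (saccadic) sample
    return fixations

# Demo: two stationary gaze periods separated by an instantaneous saccade.
t = np.arange(0.0, 0.6, 0.01)                        # 100 Hz, 0.6 s
x = np.where(t < 0.3, 0.0, 5.0) + 0.01 * np.sin(t * 50)  # small jitter
y = np.where(t < 0.3, 0.0, 5.0)
fixations = detect_fixations(t, x, y)                # two fixations expected
```

Fixation counts, durations, and transition sequences derived from such a segmentation are the raw material for evaluating which visualizations users rely on and when.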

Fig. 1: Immersive virtual environments provide visualizations in the form of a realistic world environment as well as abstract instruments. Both are important in supporting user interaction, especially in vehicle-handling simulators (e.g., flight simulators). Gaze-tracking and electrophysiological recordings (e.g., EEG/ERP, heart-based measures) are employed to evaluate, respectively, how visual information is sought out and how it is processed.
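The ERP measures mentioned in the caption are obtained by averaging stimulus-locked EEG epochs so that activity unrelated to the event cancels out. The following is a minimal sketch of that epoching-and-averaging step on synthetic data, assuming a single channel and illustrative parameter values (the function name, sampling rate, and epoch window are not taken from the project):

```python
import numpy as np

def erp_average(eeg, events, sfreq, tmin=-0.1, tmax=0.5):
    """Average event-locked EEG epochs into an ERP.

    eeg: 1-D signal for one channel; events: sample indices of stimulus
    onsets; sfreq: sampling rate in Hz. Each epoch is baseline-corrected
    using the pre-stimulus interval [tmin, 0). Returns (times, erp).
    """
    pre = int(round(-tmin * sfreq))
    post = int(round(tmax * sfreq))
    epochs = []
    for onset in events:
        if onset - pre < 0 or onset + post > len(eeg):
            continue  # skip events too close to the recording edges
        epoch = eeg[onset - pre : onset + post].astype(float)
        epoch -= epoch[:pre].mean()   # baseline correction
        epochs.append(epoch)
    times = np.arange(-pre, post) / sfreq
    return times, np.mean(epochs, axis=0)

# Demo: noise plus an evoked "component" peaking 150 ms after each event.
rng = np.random.default_rng(0)
sfreq = 500
eeg = rng.normal(0.0, 1.0, 5000)                     # 10 s of 1-µV noise
events = np.arange(500, 4500, 500)                   # 8 stimulus onsets
bump = 5.0 * np.exp(-0.5 * ((np.arange(250) - 75) / 15) ** 2)
for onset in events:
    eeg[onset:onset + 250] += bump                   # add the evoked response
times, erp = erp_average(eeg, events, sfreq)
```

Averaging over the eight epochs attenuates the noise while the time-locked component survives, which is the same logic that underlies the P3 and RON analyses in the publications below.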


  1. A. Nesti, G. Rognini, B. Herbelin, H. H. Bülthoff, L. L. Chuang, and O. Blanke, “Modulation of vection latencies in the full-body illusion,” PLoS One, 2018.
  2. C. Glatz and L. L. Chuang, “The time course of auditory looming cues in redirecting visuo-spatial attention,” Scientific Reports, 2018.
  3. C. Glatz, S. Krupenia, H. Bülthoff, and L. Chuang, “Use the right sound for the right job: verbal commands and auditory icons for a task-management system favor different information processes in the brain,” in CHI Conference on Human Factors in Computing Systems, 2018, pp. 1–13.
  4. L. Chuang, C. Glatz, and S. Krupenia, “Using EEG to understand why behavior to auditory in-vehicle notifications differs across test environments,” in 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2017, pp. 123–133.
  5. S. Borojeni, S. Boll, W. Heuten, H. Bülthoff, and L. Chuang, “Feel the movement: real motion influences responses to take-over requests in highly automated vehicles,” in CHI Conference on Human Factors in Computing Systems, 2018, pp. 1–13.
  6. J. Allsop, R. Gray, H. Bülthoff, and L. Chuang, “Eye movement planning on single-sensor-single-indicator displays is vulnerable to user anxiety and cognitive load,” Journal of Eye Movement Research, vol. 10, no. 5:8, pp. 1–15, 2017.
  7. T. Kosch, M. Funk, A. Schmidt, and L. L. Chuang, “Identifying cognitive assistance with mobile electroencephalography: a case study with in-situ projections for manual assembly,” Proceedings of the ACM on Human-Computer Interaction, vol. 2, no. EICS, p. 11, 2018.
  8. M. Scheer, H. H. Bülthoff, and L. L. Chuang, “Auditory task irrelevance: a basis for inattentional deafness,” Human Factors: The Journal of the Human Factors and Ergonomics Society, pp. 1–13, 2018.
  9. V. Schwind, P. Knierim, L. Chuang, and N. Henze, “‘Where’s Pinky?’: the effects of a reduced number of fingers in virtual reality,” in Proceedings of the Annual Symposium on Computer-Human Interaction in Play (CHI PLAY ’17), Amsterdam, Netherlands, 2017, p. 6.
  10. K. de Winkel, A. Nesti, H. Ayaz, and H. Bülthoff, “Neural correlates of decision making on whole body yaw rotation: an fNIRS study,” Neuroscience Letters, 2017.
  11. L. Lischke, S. Mayer, K. Wolf, N. Henze, H. Reiterer, and A. Schmidt, “Screen arrangements and interaction areas for large display work places,” in Proceedings of the 5th ACM International Symposium on Pervasive Displays (PerDis 2016), 2016, pp. 228–234.
  12. A. Nesti, K. de Winkel, and H. Bülthoff, “Accumulation of inertial sensory information in the perception of whole body yaw rotation,” PLoS One, 2017.
  13. D. Weiskopf, M. Burch, L. L. Chuang, B. Fischer, and A. Schmidt, Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer, 2016.
  14. L. L. Chuang and H. H. Bülthoff, “Towards a better understanding of gaze behavior in the automobile,” in Workshop on Practical Experiences in Measuring and Modeling Drivers and Driver-Vehicle Interactions, in conjunction with AutomotiveUI 2015, 2015.
  15. N. Flad, T. Fomina, H. H. Bülthoff, and L. L. Chuang, “Unsupervised clustering of EOG as a viable substitute for optical eye-tracking,” in First Workshop on Eye Tracking and Visualization (ETVIS), IEEE Visualization, 2015.
  16. L. L. Chuang, “Error visualization and information-seeking behavior for air-vehicle control,” in Foundations of Augmented Cognition, Lecture Notes in Artificial Intelligence, vol. 9183, pp. 3–11, 2015.
  17. N. Flad, J. Ditz, H. H. Bülthoff, and L. L. Chuang, “Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering,” in Second Workshop on Eye Tracking and Visualization (ETVIS), IEEE Visualization, 2016.
  18. M. Scheer, H. H. Bülthoff, and L. L. Chuang, “Steering demands diminish the early-P3, late-P3 and RON components of the event-related potential of task-irrelevant environmental sounds,” Frontiers in Human Neuroscience, vol. 10, no. 73, 2016.