C03 | Immersive Virtual Environments

Dr. Lewis L. Chuang, Max Planck Institute for Biological Cybernetics

Prof. Heinrich H. Bülthoff, Max Planck Institute for Biological Cybernetics


Prof. Albrecht Schmidt, Universität Stuttgart


Nina Flad, Max Planck Institute for Biological Cybernetics

Menja Scheer, Max Planck Institute for Biological Cybernetics

Alessandro Nesti, Max Planck Institute for Biological Cybernetics

Christiane Glatz, Max Planck Institute for Biological Cybernetics


Immersive virtual environments (IVEs) simulate real-world scenarios for purposes ranging from entertainment to safety-critical training (e.g., surgery, driving, or flying). This project investigates how computationally demanding IVEs can be optimized to ensure realistic user performance by focusing on how users seek out and process information during real-time operations. For this purpose, mobile gaze-tracking and EEG methods will be extended for use in IVEs. The research context is aircraft simulation, because it encompasses the rendering demands of a realistic world scene, the intuitive visualization of in-vehicle instrumentation, and the accurate synchronization of non-visual modalities (i.e., real motion).
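To make the EEG side of this approach concrete, the sketch below shows how event-related potentials (ERPs) are conventionally extracted: event-locked epochs are cut from the continuous signal, baseline-corrected, and averaged. This is a minimal, generic illustration using numpy and synthetic single-channel data, not the project's actual analysis pipeline; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def erp_average(eeg, events, fs, tmin=-0.1, tmax=0.5):
    """Average event-locked EEG epochs into an ERP.

    eeg        : 1-D array, one channel of continuous EEG samples
    events     : sample indices of stimulus onsets
    fs         : sampling rate in Hz
    tmin, tmax : epoch window relative to onset, in seconds
    Returns the mean baseline-corrected epoch (the ERP).
    """
    pre = int(round(-tmin * fs))   # samples before onset
    post = int(round(tmax * fs))   # samples after onset
    # Keep only epochs that fit entirely inside the recording.
    epochs = np.array([eeg[e - pre:e + post] for e in events
                       if e - pre >= 0 and e + post <= len(eeg)])
    # Baseline-correct each epoch by its pre-stimulus mean.
    epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)
    return epochs.mean(axis=0)

# Synthetic demo: a unit deflection 10 samples (100 ms at 100 Hz)
# after each of three hypothetical stimulus onsets.
fs = 100.0
eeg = np.zeros(1000)
events = [100, 300, 500]
for e in events:
    eeg[e + 10] = 1.0
erp = erp_average(eeg, events, fs)
```

Averaging over repeated events is what lets stimulus-locked components (such as the P3 examined in publication 9 below) emerge from ongoing background activity.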

Research Questions

How can users in IVEs be unobtrusively monitored without interrupting their activity?

How should visual information seeking and processing behavior be interpreted for the evaluation of visual computing?

How should visualizations be effectively combined with non-visual cues in a moving base simulator?

How do users rely on visualizations in IVE simulators to support complex closed-loop control behavior?

Immersive virtual environments provide visualizations in the form of a realistic world environment as well as abstract instruments. Both are important in supporting user interaction, especially in vehicle-handling simulators (e.g., flight simulators). Gaze-tracking and electrophysiological recordings (e.g., EEG/ERP, heart-based measures) are employed, respectively, to evaluate how visual information is sought out and processed.
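A basic building block of the gaze-tracking analyses described above (see publication 8 on saccade filtering) is separating saccades from fixations. A common, generic approach is velocity-threshold identification (I-VT): samples whose angular velocity exceeds a threshold are labeled as saccades. The sketch below is a minimal illustration under that assumption; the 30 deg/s threshold is a conventional default, not a value taken from this project.

```python
import numpy as np

def detect_saccades(x, y, fs, vel_thresh=30.0):
    """Label gaze samples as saccade (True) or fixation (False)
    using a simple velocity-threshold (I-VT) rule.

    x, y       : gaze angles in degrees
    fs         : sampling rate in Hz
    vel_thresh : angular velocity threshold in deg/s (assumed default)
    """
    vx = np.gradient(x) * fs          # horizontal velocity, deg/s
    vy = np.gradient(y) * fs          # vertical velocity, deg/s
    speed = np.hypot(vx, vy)          # combined angular speed
    return speed > vel_thresh

# Synthetic trace at 250 Hz: a fixation, a 10-degree shift over
# ~40 ms (roughly 250 deg/s), then a second fixation.
fs = 250.0
fix1 = np.zeros(50)
sacc = np.linspace(0.0, 10.0, 10)
fix2 = np.full(50, 10.0)
x = np.concatenate([fix1, sacc, fix2])
y = np.zeros_like(x)

labels = detect_saccades(x, y, fs)
```

Once samples are labeled, fixation clusters can be mapped onto regions of the simulated scene or instruments to quantify where, and for how long, visual information is sought.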


  1. K. de Winkel, A. Nesti, H. Ayaz, and H. Bülthoff, “Neural correlates of decision making on whole body yaw rotation: an fNIRS study,” Neuroscience Letters, 2017.
  2. L. Lischke, S. Mayer, K. Wolf, N. Henze, H. Reiterer, and A. Schmidt, “Screen arrangements and interaction areas for large display work places,” in Proceedings of the 5th ACM International Symposium on Pervasive Displays (PerDis 2016), pp. 228–234, 2016.
  3. A. Nesti, K. de Winkel, and H. Bülthoff, “Accumulation of inertial sensory information in the perception of whole body yaw rotation,” PLOS ONE, 2017.
  4. D. Weiskopf, M. Burch, L. L. Chuang, B. Fischer, and A. Schmidt, Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer, 2016.
  5. L. L. Chuang and H. H. Bülthoff, “Towards a better understanding of gaze behavior in the automobile,” in Workshop on Practical Experiences in Measuring and Modeling Drivers and Driver-Vehicle Interactions, in conjunction with AutomotiveUI 2015, 2015.
  6. N. Flad, T. Fomina, H. H. Bülthoff, and L. L. Chuang, “Unsupervised clustering of EOG as a viable substitute for optical eye-tracking,” in First Workshop on Eye Tracking and Visualization (ETVIS) at IEEE Visualization, 2015.
  7. L. L. Chuang, “Error visualization and information-seeking behavior for air-vehicle control,” Foundations of Augmented Cognition, Lecture Notes in Artificial Intelligence, vol. 9183, pp. 3–11, 2015.
  8. N. Flad, J. Ditz, H. H. Bülthoff, and L. L. Chuang, “Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering,” in Second Workshop on Eye Tracking and Visualization (ETVIS) at IEEE Visualization, 2016.
  9. M. Scheer, H. H. Bülthoff, and L. L. Chuang, “Steering demands diminish the early-P3, late-P3 and RON components of the event-related potential of task-irrelevant environmental sounds,” vol. 10, no. 73, 2016.