C03 | Immersive Virtual Environments

Dr. Lewis L. Chuang, LMU

Prof. Heinrich H. Bülthoff, Max Planck Institute for Biological Cybernetics

Prof. Albrecht Schmidt, LMU

Nina Flad, Max Planck Institute for Biological Cybernetics

Menja Scheer, Max Planck Institute for Biological Cybernetics

Alessandro Nesti, Max Planck Institute for Biological Cybernetics

Christiane Glatz, Max Planck Institute for Biological Cybernetics

Immersive virtual environments (IVEs) simulate real-world scenarios for purposes ranging from entertainment to safety-critical training (e.g., surgery, driving, or flying). This project investigates how computationally demanding IVEs can be optimized to ensure realistic user performance by focusing on how users seek out and process information during real-time operation. For this purpose, mobile gaze-tracking and EEG methods will be extended for use in IVEs. Aircraft simulation serves as the research context because it encompasses the rendering demands of a realistic world scene, the intuitive visualization of in-vehicle instrumentation, and the accurate synchronization of non-visual modalities (i.e., real motion).
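
To make the recording setup concrete, the sketch below shows one plausible way to synchronize simulator events with EEG samples using the Lab Streaming Layer (pylsl). This is an illustrative assumption: the project text names no specific middleware, and the stream names and source ID ('SimulatorMarkers', 'sim01', the resolved 'EEG' stream) are hypothetical.

    from pylsl import (StreamInfo, StreamOutlet, StreamInlet,
                       resolve_byprop, local_clock)

    # Hypothetical marker stream: the simulator publishes event markers
    # (e.g., an instrument alert) that can later be aligned with EEG samples.
    marker_info = StreamInfo(name='SimulatorMarkers', type='Markers',
                             channel_count=1, nominal_srate=0,
                             channel_format='string', source_id='sim01')
    marker_outlet = StreamOutlet(marker_info)

    # Resolve an EEG stream on the network (assumes an amplifier publishes one).
    eeg_streams = resolve_byprop('type', 'EEG', timeout=5.0)
    if not eeg_streams:
        raise RuntimeError('No EEG stream found on the network')
    inlet = StreamInlet(eeg_streams[0])

    # Markers and EEG samples are timestamped on the same LSL clock,
    # so they can be aligned offline without extra hardware triggers.
    marker_outlet.push_sample(['instrument_alert'], local_clock())
    for _ in range(256):
        sample, timestamp = inlet.pull_sample()
        # ... buffer (timestamp, sample) for offline analysis ...

Offline, the marker timestamps can then index the buffered EEG for epoch-based analyses such as the ERP sketch shown under Fig. 1.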

Research Questions

How can users in IVEs be unobtrusively monitored without interrupting their activity?

How should visual information-seeking and information-processing behavior be interpreted when evaluating visual computing? (A minimal gaze-classification sketch follows these questions.)

How should visualizations be combined effectively with non-visual cues in a moving-base simulator?

How do users rely on visualizations in IVE simulators to support complex closed-loop control behavior?
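
A common building block for interpreting information-seeking behavior, relevant to the second question above and to the saccade-filtering work in publication 13 below, is separating saccades from fixations. The following minimal sketch implements a standard velocity-threshold (I-VT) classifier; the 30 deg/s threshold and the synthetic input are illustrative assumptions, not project parameters.

    import numpy as np

    def classify_ivt(gaze_deg, t_s, vel_threshold=30.0):
        """Velocity-threshold (I-VT) classification of gaze samples.

        gaze_deg: (N, 2) gaze positions in degrees of visual angle.
        t_s: (N,) sample timestamps in seconds.
        Returns a boolean array, True where a sample belongs to a saccade.
        """
        dxy = np.diff(gaze_deg, axis=0)           # per-sample displacement
        dt = np.diff(t_s)
        speed = np.linalg.norm(dxy, axis=1) / dt  # angular speed in deg/s
        return np.concatenate([[False], speed > vel_threshold])

    # Illustrative use on synthetic 250 Hz data: fixation, jump, fixation.
    t = np.arange(100) / 250.0
    gaze = np.zeros((100, 2))
    gaze[50:, 0] = 10.0                           # 10-degree horizontal saccade
    print(classify_ivt(gaze, t).sum(), "sample(s) classified as saccadic")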

Fig. 1: Immersive virtual environments provide visualizations both as a realistic world environment and as abstract instruments. Both are important for supporting user interaction, especially in vehicle-handling simulators (e.g., flight simulators). Gaze-tracking and electrophysiological recordings (e.g., EEG/ERP, heart-based measures) are employed to evaluate, respectively, how visual information is sought out and how it is processed.
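
As a sketch of the ERP analysis the caption alludes to, the function below averages baseline-corrected EEG epochs around event markers. The sampling rate, epoch window, and random input data are illustrative assumptions rather than project parameters.

    import numpy as np

    def erp_average(eeg, event_idx, sfreq=500, tmin=-0.2, tmax=0.8):
        """Average EEG epochs around event markers to obtain an ERP.

        eeg: (n_channels, n_samples) continuous recording.
        event_idx: sample indices of events (e.g., auditory notifications).
        Returns an (n_channels, n_epoch_samples) baseline-corrected average.
        """
        pre, post = int(-tmin * sfreq), int(tmax * sfreq)
        epochs = np.stack([eeg[:, i - pre:i + post] for i in event_idx
                           if i - pre >= 0 and i + post <= eeg.shape[1]])
        baseline = epochs[:, :, :pre].mean(axis=2, keepdims=True)
        return (epochs - baseline).mean(axis=0)

    # Illustrative call on random data with three hypothetical event markers.
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((32, 10_000))
    erp = erp_average(eeg, event_idx=[2_000, 5_000, 8_000])
    print(erp.shape)  # (32, 500): a 1.0 s window at 500 Hz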

Publications

  1. A. Nesti, G. Rognini, B. Herbelin, H. H. Bülthoff, L. L. Chuang, and O. Blanke, “Modulation of Vection Latencies in the Full-Body Illusion,” PLoS ONE, vol. 13, no. 12, 2018, doi: 10.1371/journal.pone.0209189.
  2. C. Glatz and L. L. Chuang, “The Time Course of Auditory Looming Cues in Redirecting Visuo-Spatial Attention,” Scientific Reports, vol. 9, pp. 743:1-743:10, 2018, doi: 10.1038/s41598-018-36033-8.
  3. M. Scheer, H. H. Bülthoff, and L. L. Chuang, “Auditory Task Irrelevance: A Basis for Inattentional Deafness,” Human Factors, vol. 60, no. 3, 2018, doi: 10.1177/0018720818760919.
  4. S. S. Borojeni, S. C. J. Boll, W. Heuten, H. H. Bülthoff, and L. L. Chuang, “Feel the Movement: Real Motion Influences Responses to Take-Over Requests in Highly Automated Vehicles,” in Proceedings of the CHI Conference on Human Factors in Computing Systems, ACM, 2018, pp. 246:1-246:13, doi: 10.1145/3173574.3173820.
  5. T. Kosch, M. Funk, A. Schmidt, and L. L. Chuang, “Identifying Cognitive Assistance with Mobile Electroencephalography: A Case Study with In-Situ Projections for Manual Assembly,” Proceedings of the ACM on Human-Computer Interaction, vol. 2, pp. 11:1-11:20, 2018, doi: 10.1145/3229093.
  6. V. Schwind, P. Knierim, L. L. Chuang, and N. Henze, “‘Where’s Pinky?’: The Effects of a Reduced Number of Fingers in Virtual Reality,” in Proceedings of the Annual Symposium on Computer-Human Interaction in Play (CHI PLAY), ACM, 2017, pp. 507–515, doi: 10.1145/3116595.3116596.
  7. L. L. Chuang, C. Glatz, and S. S. Krupenia, “Using EEG to Understand Why Behavior to Auditory In-vehicle Notifications Differs Across Test Environments,” in Proceedings of the International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI), ACM, 2017, pp. 123–133, doi: 10.1145/3122986.3123017.
  8. K. de Winkel, A. Nesti, H. Ayaz, and H. Bülthoff, “Neural Correlates of Decision Making on Whole Body Yaw Rotation: An fNIRS Study,” Neuroscience Letters, vol. 654, pp. 56–62, 2017, doi: 10.1016/j.neulet.2017.04.053.
  9. A. Nesti, K. de Winkel, and H. Bülthoff, “Accumulation of Inertial Sensory Information in the Perception of Whole Body Yaw Rotation,” PLoS ONE, vol. 12, no. 1, 2017, doi: 10.1371/journal.pone.0170497.
  10. J. Allsop, R. Gray, H. Bülthoff, and L. Chuang, “Eye Movement Planning on Single-Sensor-Single-Indicator Displays is Vulnerable to User Anxiety and Cognitive Load,” Journal of Eye Movement Research, vol. 10, no. 5, 2017, doi: 10.16910/jemr.10.5.8.
  11. D. Weiskopf, M. Burch, L. L. Chuang, B. D. Fisher, and A. Schmidt, Eds., Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer, 2016, doi: 10.1007/978-3-319-47024-5.
  12. M. Scheer, H. H. Bülthoff, and L. L. Chuang, “Steering Demands Diminish the Early-P3, Late-P3 and RON Components of the Event-Related Potential of Task-Irrelevant Environmental Sounds,” Frontiers in Human Neuroscience, vol. 10, pp. 73:1-73:15, 2016, doi: 10.3389/fnhum.2016.00073.
  13. N. Flad, J. C. Ditz, A. Schmidt, H. H. Bülthoff, and L. L. Chuang, “Data-Driven Approaches to Unrestricted Gaze-Tracking Benefit from Saccade Filtering,” in Proceedings of the Second Workshop on Eye Tracking and Visualization (ETVIS), IEEE, 2016, pp. 1–5, doi: 10.1109/ETVIS.2016.7851156.
  14. L. Lischke, S. Mayer, K. Wolf, N. Henze, H. Reiterer, and A. Schmidt, “Screen Arrangements and Interaction Areas for Large Display Work Places,” in Proceedings of the ACM International Symposium on Pervasive Displays (PerDis), ACM, 2016, pp. 228–234, doi: 10.1145/2914920.2915027.
  15. L. L. Chuang and H. H. Bülthoff, “Towards a Better Understanding of Gaze Behavior in the Automobile,” in Position Papers of the Workshops at AutomotiveUI ’15, Sep. 2015. [Online]. Available: https://www.auto-ui.org/15/p/workshops/2/8_Towards%20a%20Better%20Understanding%20of%20Gaze%20Behavior%20in%20the%20Automobile_Chuang.pdf
  16. L. L. Chuang, “Error Visualization and Information-Seeking Behavior for Air-Vehicle Control,” in Foundations of Augmented Cognition (AC 2015), Lecture Notes in Computer Science, vol. 9183, D. Schmorrow and C. M. Fidopiastis, Eds., Springer, 2015, pp. 3–11, doi: 10.1007/978-3-319-20816-9_1.
  17. N. Flad, T. Fomina, H. H. Bülthoff, and L. L. Chuang, “Unsupervised Clustering of EOG as a Viable Substitute for Optical Eye Tracking,” in Eye Tracking and Visualization: Foundations, Techniques, and Applications, M. Burch, L. L. Chuang, B. D. Fisher, A. Schmidt, and D. Weiskopf, Eds., Springer International Publishing, 2015, pp. 151–167, doi: 10.1007/978-3-319-47024-5_9.
