C07 | Optimization for Dynamic Mixed Reality User Interfaces

Jun.-Prof. Tiare Feuchtner, University of Konstanz

Jun.-Prof. Sven Mayer, LMU Munich

Abdelrahman Zaky, University of Konstanz

Dr. Anke Reinschlüssel, University of Konstanz

In this project, we aim to dynamically adapt the user interface during interaction with Cross-Reality (XR) applications on head-mounted displays (HMDs), in order to improve usability and ensure the user's safety and comfort.

XR refers to any system that immerses the user in an interactive virtual or virtually augmented environment, such as Virtual Reality (VR) or Augmented Reality (AR). Virtual content is often placed in the space around the user's body and can be directly manipulated with the hands or with controllers. This shift of virtual content from the 2D space of the computer screen into the user's 3D environment brings a myriad of new challenges for user interface (UI) and interaction design, owing to the complex and dynamic context of interaction. Relevant contexts include the user, the physical environment, and the current activity. For example, an AR application that presents a virtual button in mid-air in front of the user may occlude their view of their conversation partner, lead them to knock over a glass of water on the table when reaching for it, and cause muscle fatigue when used continuously.

We aim to address these challenges by optimizing 3D UIs with regard to the placement of information and interactive elements, as well as the employed interaction techniques and feedback, depending on the context of interaction.
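To make this concrete, one common way to frame such placement optimization is as the minimization of a weighted sum of context-dependent cost terms (e.g., reachability, ergonomics, occlusion) over a set of candidate positions. The sketch below illustrates this idea only; the cost terms, weights, and thresholds are illustrative assumptions and do not represent the project's actual model.

```python
# Illustrative sketch: weighted multi-objective placement of a UI element.
# All cost functions, constants, and weights are hypothetical examples.
import math
from dataclasses import dataclass


@dataclass
class Candidate:
    """Candidate UI position relative to the user's head, in metres."""
    x: float
    y: float
    z: float


def _dist(c: Candidate) -> float:
    return math.sqrt(c.x ** 2 + c.y ** 2 + c.z ** 2)


def reach_cost(c: Candidate, arm_length: float = 0.6) -> float:
    """Penalize placements far from a comfortable reach distance (~0.45 m)."""
    return abs(_dist(c) - 0.45) / arm_length


def ergonomic_cost(c: Candidate) -> float:
    """Penalize raised placements: shoulder strain grows with arm elevation."""
    return max(0.0, c.y) * 2.0


def occlusion_cost(c: Candidate, gaze=(0.0, 0.0, 1.0)) -> float:
    """Penalize placements near the user's line of sight (cosine to gaze)."""
    d = _dist(c)
    if d == 0.0:
        return 1.0
    cos = (c.x * gaze[0] + c.y * gaze[1] + c.z * gaze[2]) / d
    return max(0.0, cos)


def best_placement(candidates, weights=(1.0, 1.0, 0.5)) -> Candidate:
    """Pick the candidate minimizing the weighted sum of all cost terms."""
    wr, we, wo = weights
    return min(
        candidates,
        key=lambda c: wr * reach_cost(c)
        + we * ergonomic_cost(c)
        + wo * occlusion_cost(c),
    )
```

For instance, among a button straight ahead at eye level, one raised above the shoulder, and one low and to the side, the low off-axis candidate wins: it is within reach, requires no arm elevation, and stays out of the line of sight. Adjusting the weights shifts the trade-off, which is one simple way to express the conflicting objectives discussed above.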

Research Questions

What UI characteristics can and should be adapted and how (e.g., placement, appearance, interaction technique, user representation, immersion, physicality)?

What are meaningful characteristics of the interaction context and how can they be quantified to facilitate real-time computation of optimization functions?

How can we handle conflicting optimization objectives and make their resolution apparent to the designer or user?

How much customizability does the user need during interaction, ranging from pre-defined modes to a fully adaptive system that learns over time?

How can we adapt the UI to effectively support collaboration and social interaction between multiple users?

What are meaningful and measurable success criteria for validating optimization results in user studies?


Fig. 1: Common mid-air interaction with head-mounted Augmented Reality systems


Fig. 2: Illustration of ergonomic cost in the user’s interaction space

Publications

  1. A. V. Reinschluessel and J. Zagermann, “Exploring Hybrid User Interfaces for Surgery Planning,” in 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2023, pp. 208–210. doi: 10.1109/ISMAR-Adjunct60411.2023.00048.
  2. J. Zagermann, S. Hubenschmid, D. Fink, J. Wieland, H. Reiterer, and T. Feuchtner, “Challenges and Opportunities for Collaborative Immersive Analytics with Hybrid User Interfaces,” in 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Los Alamitos, CA, USA: IEEE Computer Society, Oct. 2023, pp. 191–195. doi: 10.1109/ISMAR-Adjunct60411.2023.00044.
  3. S. Hubenschmid, J. Zagermann, D. Leicht, H. Reiterer, and T. Feuchtner, “ARound the Smartphone: Investigating the Effects of Virtually-Extended Display Size on Spatial Memory,” in Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), New York, NY, USA: ACM, 2023. doi: 10.1145/3544548.3581438.
  4. A. Zaky, J. Zagermann, H. Reiterer, and T. Feuchtner, “Opportunities and Challenges of Hybrid User Interfaces for Optimization of Mixed Reality Interfaces,” in 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2023, pp. 215–219. doi: 10.1109/ISMAR-Adjunct60411.2023.00050.
  5. F. Chiossi et al., “Adapting visualizations and interfaces to the user,” it - Information Technology, vol. 64, no. 4–5, 2022. doi: 10.1515/itit-2022-0035.
  6. J. Zagermann et al., “Complementary Interfaces for Visual Computing,” it - Information Technology, vol. 64, no. 4–5, 2022. doi: 10.1515/itit-2022-0031.

Project Group A: Models and Measures (Completed)

Project Group B: Adaptive Algorithms (Completed)

Project Group C: Interaction (Completed)

Project Group D: Applications (Completed)