A. V. Reinschluessel and J. Zagermann, “Exploring Hybrid User Interfaces for Surgery Planning,” in 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2023, pp. 208–210. doi: 10.1109/ISMAR-Adjunct60411.2023.00048.
Abstract
Hybrid user interfaces are a great opportunity to combine complementary interfaces to make use of the best interface for specific steps in a workflow. This position paper outlines one diverse application field: surgery planning. Planning a surgery is a complex task as the surgical team has to get an overview and understanding of a patient’s medical history and the internal anatomical structures of the organ or region of interest. In this position paper, we outline how different hardware (e.g., mixed reality head-worn devices and physical objects) and interaction concepts (e.g., gesture-based interaction or keyboard and mouse) can create an optimal workflow for surgery planning.
J. Zagermann, S. Hubenschmid, D. Fink, J. Wieland, H. Reiterer, and T. Feuchtner, “Challenges and Opportunities for Collaborative Immersive Analytics with Hybrid User Interfaces,” in 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Los Alamitos, CA, USA: IEEE Computer Society, Oct. 2023, pp. 191–195. doi: 10.1109/ISMAR-Adjunct60411.2023.00044.
Abstract
Over the past years, we have seen an increase in the number of user studies involving mixed reality interfaces. As these environments usually exceed standardized user study settings that only measure time and error, we developed, designed, and evaluated a mixed-immersion evaluation framework called ReLive. Its combination of in-situ and ex-situ analysis approaches allows for the holistic and malleable analysis and exploration of mixed reality user study data by an individual analyst in a step-by-step approach that we previously described as an asynchronous hybrid user interface. Yet, collaboration has been identified as a key aspect of visual and immersive analytics — potentially allowing multiple analysts to synchronously explore mixed reality user study data from different but complementary angles of evaluation using hybrid user interfaces. This leads to a variety of fundamental challenges and opportunities for the research and design of hybrid user interfaces regarding, e.g., the allocation of tasks, the interplay between views, user representations, and collaborative coupling, which are outlined in this position paper.
S. Hubenschmid, J. Zagermann, D. Leicht, H. Reiterer, and T. Feuchtner, “ARound the Smartphone: Investigating the Effects of Virtually-Extended Display Size on Spatial Memory,” in Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), New York, NY, USA: ACM, 2023. doi: 10.1145/3544548.3581438.
Abstract
Smartphones conveniently place large information spaces in the palms of our hands. While research has shown that larger screens positively affect spatial memory, workload, and user experience, smartphones remain fairly compact for the sake of device ergonomics and portability. Thus, we investigate the use of hybrid user interfaces to virtually increase the available display size by complementing the smartphone with an augmented reality head-worn display. We thereby combine the benefits of familiar touch interaction with the near-infinite visual display space afforded by augmented reality. To better understand the potential of virtually-extended displays and the possible issues of splitting the user’s visual attention between two screens (real and virtual), we conducted a within-subjects experiment with 24 participants completing navigation tasks using different virtually-augmented display sizes. Our findings reveal that a desktop monitor size represents a “sweet spot” for extending smartphones with augmented reality, informing the design of hybrid user interfaces.
A. Zaky, J. Zagermann, H. Reiterer, and T. Feuchtner, “Opportunities and Challenges of Hybrid User Interfaces for Optimization of Mixed Reality Interfaces,” in 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2023, pp. 215–219. doi: 10.1109/ISMAR-Adjunct60411.2023.00050.
Abstract
Current research highlights the importance of adaptive mixed reality interfaces, as increased adoption leads to increasingly diverse, complex, and unconstrained interaction scenarios. An interesting approach for adaptation is the optimization of interface layout and behaviour. We thereby consider three distinct types of context to which the interface adapts: the user, the activity, and the environment. The latter of these includes a myriad of interactive devices surrounding the user, the capabilities of which we propose to take advantage of by integrating them in a hybrid user interface. Hybrid user interfaces offer many opportunities to address distinct usability issues, such as visibility, reachability, and ergonomics. However, considering additional interactive devices for optimizing mixed reality interfaces introduces a number of additional challenges, such as detecting available and suitable devices and modeling the respective interaction costs. Moreover, using different devices potentially introduces a switching cost, e.g., in terms of cognitive load and time. In this paper, we aim to discuss different opportunities and challenges of using hybrid user interfaces for the optimization of mixed reality interfaces and thereby highlight directions for future work.
F. Chiossi et al., “Adapting visualizations and interfaces to the user,” it - Information Technology, vol. 64, no. 4–5, 2022. doi: 10.1515/itit-2022-0035.
Abstract
Adaptive visualization and interfaces pervade our everyday tasks to improve interaction from the point of view of user performance and experience. This approach allows using several user inputs, whether physiological, behavioral, qualitative, or multimodal combinations, to enhance the interaction. Due to the multitude of approaches, we outline the current research trends of inputs used to adapt visualizations and user interfaces. Moreover, we discuss methodological approaches used in mixed reality, physiological computing, visual analytics, and proficiency-aware systems. With this work, we provide an overview of the current research in adaptive systems.
J. Zagermann et al., “Complementary Interfaces for Visual Computing,” it - Information Technology, vol. 64, no. 4–5, 2022. doi: 10.1515/itit-2022-0031.
Abstract
With increasing complexity in visual computing tasks, a single device may not be sufficient to adequately support the user’s workflow. Here, we can employ multi-device ecologies such as cross-device interaction, where a workflow can be split across multiple devices, each dedicated to a specific role. But what makes these multi-device ecologies compelling? Based on insights from our research, each device or interface component must contribute a complementary characteristic to increase the quality of interaction and further support users in their current activity. We establish the term complementary interfaces for such meaningful combinations of devices and modalities and provide an initial set of challenges. In addition, we demonstrate the value of complementarity with examples from within our own research.