Immersive Analytics (IA) is an emerging field that studies technologies facilitating a deep cognitive, perceptual and/or emotional involvement of humans when understanding and reasoning with data. The goal of this project is to investigate and quantify the impact of such technologies on immersion, and the role of immersion for data analytics. We aim to further develop the Immersive Analytics methodology and investigate the applicability of IA approaches to research tasks in the life sciences, with a particular focus on quantitative aspects of immersive analytics. We will design immersive environments for selected applications of the life sciences and develop new methodologies that allow us to put the human in the loop for an immersive experience during an analytics workflow.
How can we quantify immersion in an analytics process, and how can we quantify the impact of immersion?
How can we best support analytics and decision making tasks with Immersive Analytics approaches facilitated by new technologies?
What are new potentials and benefits that IA brings for tasks in the life sciences, and how can we quantify them?
Fig. 1: Immersive Analytics (IA).
D. Bienroth et al., “Spatially resolved transcriptomics in immersive environments,” Visual Computing for Industry, Biomedicine, and Art, vol. 5, no. 1, Art. no. 1, 2022, doi: 10.1186/s42492-021-00098-6.
Spatially resolved transcriptomics is an emerging class of high-throughput technologies that enable biologists to systematically investigate the expression of genes along with spatial information. Upon data acquisition, one major hurdle is the subsequent interpretation and visualization of the datasets acquired. To address this challenge, VR-Cardiomics is presented, which is a novel data visualization system with interactive functionalities designed to help biologists interpret spatially resolved transcriptomic datasets. By implementing the system in two separate immersive environments, fish tank virtual reality (FTVR) and head-mounted display virtual reality (HMD-VR), biologists can interact with the data in novel ways not previously possible, such as visually exploring the gene expression patterns of an organ, and comparing genes based on their 3D expression profiles. Further, a biologist-driven use-case is presented, in which immersive environments facilitate biologists to explore and compare the heart expression profiles of different genes.
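The abstract above compares genes by their 3D expression profiles. As a minimal sketch of what such a comparison might look like, assuming each gene's expression is sampled at the same ordered set of spatial locations, a cosine similarity over the expression vectors can be used. The function name and the choice of cosine similarity are illustrative assumptions, not the actual VR-Cardiomics implementation:

```python
import math

def expression_similarity(expr_a, expr_b):
    """Cosine similarity between two genes' expression vectors,
    assumed to be sampled at the same ordered set of 3D locations."""
    dot = sum(a * b for a, b in zip(expr_a, expr_b))
    norm_a = math.sqrt(sum(a * a for a in expr_a))
    norm_b = math.sqrt(sum(b * b for b in expr_b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Two hypothetical genes measured at four spatial spots:
gene_x = [0.0, 1.0, 2.0, 3.0]
gene_y = [0.0, 2.0, 4.0, 6.0]
print(expression_similarity(gene_x, gene_y))  # proportional profiles -> ~1.0
```

A value near 1 indicates two genes vary together across the organ, near 0 indicates unrelated spatial patterns.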
After a long period of scepticism, more and more publications describe basic research but also practical approaches to how abstract data can be presented in immersive environments for effective and efficient data understanding. Central aspects of this important research question in immersive analytics research are concerned with the use of 3D for visualization, the embedding in the immersive space, the combination with spatial data, suitable interaction paradigms and the evaluation of use cases. We provide a characterization that facilitates the comparison and categorization of published works and present a survey of publications that gives an overview of the state of the art, current trends, and gaps and challenges in current research.
M. Kraus, K. Klein, J. Fuchs, D. A. Keim, F. Schreiber, and M. Sedlmair, “The Value of Immersive Visualization,” IEEE Computer Graphics and Applications (CG&A), vol. 41, no. 4, Art. no. 4, 2021, doi: 10.1109/MCG.2021.3075258.
In recent years, research on immersive environments has experienced a new wave of interest, and immersive analytics has been established as a new research field. Every year, a vast amount of different techniques, applications, and user studies are published that focus on employing immersive environments for visualizing and analyzing data. Nevertheless, immersive analytics is still a relatively unexplored field that needs more basic research in many aspects and is still viewed with skepticism. Rightly so, because in our opinion, many researchers do not fully exploit the possibilities offered by immersive environments and, on the contrary, sometimes even overestimate the power of immersive visualizations. Although a growing body of papers has demonstrated individual advantages of immersive analytics for specific tasks and problems, the general benefit of using immersive environments for effective analytic tasks remains controversial. In this article, we reflect on when and how immersion may be appropriate for the analysis and present four guiding scenarios. We report on our experiences, discuss the landscape of assessment strategies, and point out the directions where we believe immersive visualizations have the greatest potential.
Large metabolic models, including genome-scale metabolic models, are nowadays common in systems biology, biotechnology and pharmacology. They typically contain thousands of metabolites and reactions, and therefore methods for their automatic visualization and interactive exploration can facilitate a better understanding of these models. We developed a novel method for the visual exploration of large metabolic models and implemented it in LMME (Large Metabolic Model Explorer), an add-on for the biological network analysis tool VANTED. The underlying idea of our method is to analyze a large model as follows. Starting from a decomposition into several subsystems, relationships between these subsystems are identified and an overview is computed and visualized. From this overview, detailed subviews may be constructed and visualized in order to explore subsystems and relationships in greater detail. Decompositions may either be predefined or computed, using built-in or self-implemented methods. Realized as an add-on for VANTED, LMME is embedded in a domain-specific environment, allowing for further related analysis at any stage during the exploration. We describe the method, provide a use case and discuss the strengths and weaknesses of different decomposition methods. The methods and algorithms presented here are implemented in LMME, an open-source add-on for VANTED. LMME can be downloaded from www.cls.uni-konstanz.de/software/lmme and VANTED can be downloaded from www.vanted.org. The source code of LMME is available from GitHub, at https://github.com/LSI-UniKonstanz/lmme.
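LMME itself is a Java-based VANTED add-on; as a language-agnostic sketch of the overview idea described above, assume each subsystem of the decomposition is given as a set of metabolites, and connect two subsystems in the overview whenever they share a metabolite. The data and rule below are illustrative, not LMME's actual relationship definition:

```python
from itertools import combinations

def overview_graph(subsystem_metabolites):
    """Overview of a decomposition: one node per subsystem, and an
    edge whenever two subsystems share at least one metabolite."""
    edges = set()
    for a, b in combinations(sorted(subsystem_metabolites), 2):
        if subsystem_metabolites[a] & subsystem_metabolites[b]:
            edges.add(frozenset((a, b)))
    return edges

# Hypothetical three-subsystem decomposition:
decomposition = {
    "glycolysis": {"glucose", "pyruvate", "atp"},
    "tca_cycle": {"pyruvate", "citrate", "atp"},
    "lipid_synthesis": {"acetyl_coa"},
}
# glycolysis and tca_cycle share pyruvate and atp; lipid_synthesis is isolated
print(overview_graph(decomposition))
```

Detailed subviews would then expand a chosen overview node or edge back into the underlying reactions.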
The human microbiome is largely shaped by the chemical interactions of its microbial members, which includes cross-talk via shared signals or quenching of the signalling of other species. Quorum sensing is a process that allows microbes to coordinate their behaviour depending on their population density and to adjust gene expression accordingly. We present the Quorum Sensing Database (QSDB), a comprehensive database of all published sensing and quenching relations between organisms and signalling molecules of the human microbiome, as well as an interactive web interface that allows browsing the database, provides graphical depictions of sensing mechanisms as Systems Biology Graphical Notation diagrams, and links to other databases. Database URL: QSDB (Quorum Sensing DataBase) is freely available via an interactive web interface and as a downloadable CSV file at http://qsdb.org.
K. Klein, M. Aichem, Y. Zhang, S. Erk, B. Sommer, and F. Schreiber, “TEAMwISE: synchronised immersive environments for exploration and analysis of animal behaviour,” Journal of Visualization, 2021, doi: 10.1007/s12650-021-00746-2.
The recent availability of affordable and lightweight tracking sensors allows researchers to collect large and complex movement data sets. To explore and analyse these data, applications are required that are capable of handling the data while providing an environment that enables the analyst(s) to focus on the task of investigating the movement in the context of the geographic environment it occurred in. We present an extensible, open-source framework for collaborative analysis of geospatial–temporal movement data with a use case in collective behaviour analysis. The framework TEAMwISE supports the concurrent usage of several program instances, allowing different perspectives on the same data in collocated or remote set-ups. The implementation can be deployed in a variety of immersive environments, for example, on a tiled display wall and mobile VR devices.
Current tracking technology such as GPS data loggers allows biologists to remotely collect large amounts of movement data for a large variety of species. Extending, and often replacing, interpretation based on observation, the analysis of the collected data supports research on animal behaviour, on impact factors such as climate change and human intervention on the globe, as well as on conservation programs. However, this analysis is difficult, due to the nature of the research questions and the complexity of the data sets. It requires both automated analysis, for example, for the detection of behavioural patterns, and human inspection, for example, for interpretation, inclusion of previous knowledge, and for conclusions on future actions and decision making. For this analysis and inspection, the movement data needs to be put into the context of environmental data, which helps to interpret the behaviour. Thus, a major challenge is to design and develop methods and intuitive interfaces that integrate the data for analysis by biologists. We present a concept and implementation for the visual analysis of cheetah movement data in a web-based fashion that allows usage both in the field and in office environments.
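As an illustration of the kind of derived quantity such a movement-analysis tool computes (not the paper's actual code), step speeds can be estimated from consecutive GPS fixes using the haversine great-circle distance; the track format below is an assumption for the sketch:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes
    (haversine formula, mean Earth radius)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def step_speeds(track):
    """track: list of (timestamp_s, lat, lon) fixes in time order;
    returns the speed in m/s for each consecutive pair of fixes."""
    speeds = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:]):
        dt = t1 - t0
        if dt > 0:
            speeds.append(haversine_m(la0, lo0, la1, lo1) / dt)
    return speeds
```

Speed profiles like these feed into the detection of behavioural patterns such as resting, travelling, or hunting bursts.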
The Tiled Stereoscopic 3D Display Wall (TS3DW) is a monitor system consisting of six consumer 3D TVs. Two monitors reside on a mobile display mount. One standard configuration is to use the mounts at a 135-degree angle to each other, with one mobile mount in the center and one at each side. In this way, the system can be transported to multiple locations across a campus and used in different application scenarios. The system has already been used for a number of research projects and presentations.
K. Klein, M. Aichem, B. Sommer, S. Erk, Y. Zhang, and F. Schreiber, “TEAMwISE: Synchronised Immersive Environments for Exploration and Analysis of Movement Data,” in Proceedings of the ACM Symposium on Visual Information Communication and Interaction (VINCI), 2019, pp. 9:1-9:5. doi: 10.1145/3356422.3356450.
The recent availability of affordable and lightweight tracking sensors allows researchers to collect large and complex movement datasets. These datasets require applications that are capable of handling them whilst providing an environment that enables the analyst(s) to focus on the task of analysing the movement in the context of the geographic environment it occurred in. We present a framework for collaborative analysis of geospatial-temporal movement data with a use-case in collective behavior analysis. It supports the concurrent usage of several program instances, allowing different perspectives on the same data in collocated or remote setups. The implementation can be deployed in a variety of immersive environments, e.g. on a tiled display wall or mobile VR devices.
Analysing and understanding brain function and disorder is the main focus of neuroscience. Due to the high complexity of the brain, directionality of the signal and changing activity over time, visual exploration and data analysis are difficult. For this reason, a vast number of research challenges remain unsolved. We explored different challenges of the visual analysis of brain data and the design of corresponding immersive environments in collaboration with experts from the biomedical domain. We built a prototype of an immersive virtual reality environment to explore the design space and to investigate how brain data analysis can be supported by a variety of design choices. Our environment can be used to study the effect of different visualisations and combinations of brain data representation, such as network layouts, anatomical mapping or time series. As a long-term goal, we aim to aid neuroscientists in better understanding brain function and disorder.
K. Klein et al., “Fly with the flock: immersive solutions for animal movement visualization and analytics,” Journal of the Royal Society Interface, vol. 16, no. 153, Art. no. 153, 2019, doi: 10.1098/rsif.2018.0794.
Understanding the movement of animals is important for a wide range of scientific interests including migration, disease spread, collective movement behaviour and analysing motion in relation to dynamic changes of the environment such as wind and thermal lifts. Particularly, the three-dimensional (3D) spatial–temporal nature of bird movement data, which is widely available with high temporal and spatial resolution at large volumes, presents a natural option to explore the potential of immersive analytics (IA). We investigate the requirements and benefits of a wide range of immersive environments for explorative visualization and analytics of 3D movement data, in particular regarding design considerations for such 3D immersive environments, and present prototypes for IA solutions. Tailored to biologists studying bird movement data, the immersive solutions enable geo-locational time-series data to be investigated interactively, thus enabling experts to visually explore interesting angles of a flock and its behaviour in the context of the environment. The 3D virtual world presents the audience with engaging and interactive content, allowing users to ‘fly with the flock’, with the potential to ascertain an intuitive overview of often complex datasets, thereby providing the opportunity to formulate and at least qualitatively assess hypotheses. This work also contributes to ongoing research efforts to promote better understanding of bird migration and the associated environmental factors at the global scale, thereby providing a visual vehicle for driving public awareness of environmental issues and bird migration patterns.
Recent advances in tracking technology allow biologists to collect large amounts of movement data for a variety of species. Analysis of the collected data supports research on animal behaviour, influence of impact factors such as climate change and human intervention, as well as conservation programs. Analysis of the movement data is difficult, due to the nature of the research questions and the complexity of the data sets. It requires both automated analysis, e.g. for the detection of behavioural patterns, and human inspection, e.g. for interpretation, inclusion of previous knowledge, and for conclusions on future actions and decision making. We present a concept and implementation for the visual analysis of cheetah movement data in a web-based fashion that allows usage both in the field and in office environments.
The emerging field of Immersive Analytics investigates how novel display and interaction technologies can enable people to better explore and analyse data and complex information. Collaboration is a crucial aspect of Immersive Analytics. In this paper we present ContextuWall, a system for interactive local and remote collaboration using touch and mobile devices as well as displays of various sizes. The system enables groups of users at different sites to share content to a jointly used virtual desktop which is accessible over a secured network. This virtual desktop can be shown on different large displays simultaneously, taking advantage of their high resolution. To enable users to intuitively share, arrange, and annotate image content, a purpose-built client software has been developed and can easily be adapted with plug-ins for existing data analytics software. We show exemplary use cases and describe the system architecture and its implementation.
Pseudomonas aeruginosa often causes multidrug-resistant infections in immunocompromised patients, and polymyxins are often used as the last-line therapy. Alarmingly, resistance to polymyxins has been increasingly reported worldwide recently. To rescue this last-resort class of antibiotics, it is necessary to systematically understand how P. aeruginosa alters its metabolism in response to polymyxin treatment, thereby facilitating the development of effective therapies. To this end, a genome-scale metabolic model (GSMM) was used to analyze bacterial metabolic changes at the systems level.
A high-quality GSMM iPAO1 was constructed for P. aeruginosa PAO1 for antimicrobial pharmacological research. Model iPAO1 encompasses an additional periplasmic compartment and contains 3022 metabolites, 4265 reactions, and 1458 genes in total. Growth prediction on 190 carbon and 95 nitrogen sources achieved an accuracy of 89.1%, outperforming all reported P. aeruginosa models. Notably, prediction of the essential genes for growth achieved a high accuracy of 87.9%. Metabolic simulation showed that lipid A modifications associated with polymyxin resistance exert a limited impact on bacterial growth and metabolism but remarkably change the physiochemical properties of the outer membrane. Modeling with transcriptomics constraints revealed a broad range of metabolic responses to polymyxin treatment, including reduced biomass synthesis, upregulated amino acid catabolism, induced flux through the tricarboxylic acid cycle, and increased redox turnover.
Overall, iPAO1 represents the most comprehensive GSMM constructed to date for Pseudomonas. It provides a powerful systems pharmacology platform for the elucidation of complex killing mechanisms of antibiotics.
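Growth predictions with a GSMM such as iPAO1 rest on the steady-state assumption S·v = 0, i.e. no net accumulation of internal metabolites for a stoichiometric matrix S and flux vector v. A minimal check of that assumption, using a hypothetical three-reaction toy network rather than iPAO1's 4265 reactions, can be sketched as:

```python
def is_steady_state(S, v, tol=1e-9):
    """True if the flux vector v satisfies S.v = 0 for stoichiometric
    matrix S (rows = metabolites, columns = reactions)."""
    return all(abs(sum(s * f for s, f in zip(row, v))) <= tol for row in S)

# Toy network: uptake (-> A), r1 (A -> B), export (B ->)
S = [
    [1, -1, 0],   # metabolite A: produced by uptake, consumed by r1
    [0, 1, -1],   # metabolite B: produced by r1, consumed by export
]
print(is_steady_state(S, [2.0, 2.0, 2.0]))  # balanced fluxes -> True
print(is_steady_state(S, [2.0, 1.0, 2.0]))  # A accumulates -> False
```

Flux balance analysis then maximises a biomass objective over all flux vectors satisfying this constraint plus reaction bounds, which in practice is solved with a linear-programming library rather than by hand.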
M. de Ridder, K. Klein, and J. Kim, “A Review and Outlook on Visual Analytics for Uncertainties in Functional Magnetic Resonance Imaging,” Brain Informatics, vol. 5, no. 2, Art. no. 2, 2018, doi: 10.1186/s40708-018-0083-0.
Analysis of functional magnetic resonance imaging (fMRI) plays a pivotal role in uncovering an understanding of the brain. fMRI data contain both spatial volume and temporal signal information, which provide a depiction of brain activity. The analysis pipeline, however, is hampered by numerous uncertainties in many of the steps; often seen as one of the last hurdles for the domain. In this review, we categorise fMRI research into three pipeline phases: (i) image acquisition and processing; (ii) image analysis; and (iii) visualisation and human interpretation, to explore the uncertainties that arise in each phase, including the compound effects due to the inter-dependence of steps. Attempts at mitigating uncertainties rely on providing interactive visual analytics that aid users in understanding the effects of the uncertainties and adjusting their analyses. This impetus for visual analytics comes in light of considerable research investigating uncertainty throughout the pipeline. However, to the best of our knowledge, there is yet to be a comprehensive review on the importance and utility of uncertainty visual analytics (UVA) in addressing fMRI concerns, which we term fMRI-UVA. Such techniques have been broadly implemented in related biomedical fields, and their potential for fMRI has recently been explored; however, these attempts are limited in their scope and utility, primarily focussing on addressing small parts of single pipeline phases. Our comprehensive review of the fMRI uncertainties from the perspective of visual analytics addresses the three identified phases in the pipeline. We also discuss the two interrelated approaches for future research opportunities for fMRI-UVA.
V. Yoghourdjian, T. Dwyer, K. Klein, K. Marriott, and M. Wybrow, “Graph Thumbnails: Identifying and Comparing Multiple Graphs at a Glance,” IEEE Transactions on Visualization and Computer Graphics, vol. 24, no. 12, Art. no. 12, 2018, doi: 10.1109/TVCG.2018.2790961.
We propose Graph Thumbnails, small icon-like visualisations of the high-level structure of network data. Graph Thumbnails are designed to be legible in small multiples to support rapid browsing within large graph corpora. Compared to existing graph-visualisation techniques, our representation has several advantages: (1) the visualisation can be computed in linear time; (2) it is canonical in the sense that isomorphic graphs will always have identical thumbnails; and (3) it provides precise information about the graph structure. We report the results of two user studies. The first study compares Graph Thumbnails to node-link and matrix views for identifying similar graphs. The second study investigates the comprehensibility of the different representations. We demonstrate the usefulness of this representation for summarising the evolution of protein-protein interaction networks across a range of species.
Immersive Analytics is a new research initiative that aims to remove barriers between people, their data and the tools they use for analysis and decision making. Here the aims of immersive analytics research are clarified, along with its opportunities and historical context, and a broad research agenda for the field is provided. In addition, it is reviewed how the term immersion has been used to refer to both technological and psychological immersion, both of which are central to immersive analytics research.
M. Ghaffar et al., “3D Modelling and Visualisation of Heterogeneous Cell Membranes in Blender,” in Proceedings of the 11th International Symposium on Visual Information Communication and Interaction, Växjö, Sweden, 2018, pp. 64–71. doi: 10.1145/3231622.3231639.
Chlamydomonas reinhardtii cells have been a focus of research for more than a decade, in particular due to their use as an alternative source for energy production. However, the molecular processes in these cells are still not completely known, and 3D visualisations may help to understand these complex interactions and processes. In previous work, we presented the stereoscopic 3D (S3D) visualisation of a complete Chlamydomonas reinhardtii cell created with the 3D modelling framework Blender. This animation already contained a scene showing an illustrative membrane model of the thylakoid membrane. During discussion with domain experts, shortcomings of the visualisation for several detailed analysis questions were identified and it was decided to redefine it. A new modelling and visualisation pipeline based on a Membrane Packing Algorithm was developed, which can be configured via a user interface, enabling the composition of membranes employing published material. An expert user study was conducted to evaluate this new approach, with half the participants having a biology and the other half having an informatics background. The new and old Chlamydomonas thylakoid membrane models were presented on a S3D back projection system. The evaluation results reveal that the majority of participants preferred the new, more realistic membrane visualisation. However, the opinion varied with the expertise, leading to valuable conclusions for future visualisations. Interestingly, the S3D presentation of molecular structures led to a positive change in opinion regarding S3D technology.
M. de Ridder, K. Klein, and J. Kim, “Temporaltracks: Visual Analytics for Exploration of 4D fMRI Time-series Coactivation,” in Proceedings of the Computer Graphics International Conference (CGI), 2017, pp. 13:1-13:6. doi: 10.1145/3095140.3095153.
Functional magnetic resonance imaging (fMRI) is a 4D medical imaging modality that depicts a proxy of neuronal activity in a series of temporal scans. Statistical processing of the modality shows promise in uncovering insights about the functioning of the brain, such as the default mode network, and characteristics of mental disorders. Current statistical processing generally summarises the temporal signals between brain regions into a single data point to represent the 'coactivation' of the regions, that is, how similar their temporal patterns are over the scans. However, the potential of such processing is limited by issues of possible data misrepresentation due to uncertainties, e.g. noise in the data. Moreover, it has been shown that brain signals are characterised by brief traces of coactivation, which are lost in the single value representations. To alleviate the issues, alternate statistical processes have been used; however, creating effective techniques has proven difficult due to problems such as noise, which often require user input to uncover. Visual analytics, therefore, through its ability to interactively exploit human expertise, presents itself as an interesting approach of benefit to the domain. In this work, we present the conceptual design behind TemporalTracks, our visual analytics system for exploration of 4D fMRI time-series coactivation data, utilising a visual metaphor to effectively present coactivation data for easier understanding. We describe our design with a case study visually analysing Human Connectome Project data, demonstrating that TemporalTracks can uncover temporal events that would otherwise be hidden in standard analysis.
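The single-value coactivation summary mentioned above is typically a Pearson correlation between two regions' time series. A minimal sketch, with made-up region data, shows why brief traces of coactivation vanish into one number:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient: the common single-value
    'coactivation' summary of two regions' fMRI time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

# Hypothetical signals from two brain regions over four scans:
region_a = [1.0, 2.0, 3.0, 4.0]
region_b = [2.0, 4.0, 6.0, 8.0]
print(pearson(region_a, region_b))  # perfectly correlated -> ~1.0
```

Whatever temporal structure the two signals share, only this one scalar survives into a standard coactivation matrix, which is the loss TemporalTracks addresses.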
Recent advances in miniaturising sensor tags allow researchers to obtain high-resolution bird trajectories, presenting an opportunity for immersive close-up observation of individual and group behaviour in mid-air. The combination of geographical, environmental, and movement data is well suited for investigation in immersive analytics environments. We explore the benefits and requirements of a wide range of such environments, and illustrate a multi-platform immersive analytics solution, based on a tiled 3D display wall and head-mounted displays (Google Cardboard, HTC Vive and Microsoft Hololens). Tailored to biologists studying bird movement data, the immersive environment provides a novel interactive mode to explore the geolocational time-series data. This paper aims to inform the 3D visualisation research community about design considerations obtained from a real world data set in different 3D immersive environments. This work also contributes to ongoing research efforts to promote better understanding of bird migration and the associated environmental factors at the planet-level scale, thereby capturing the public awareness of environmental issues.
Immersive Analytics is an emerging research thrust investigating how new interaction and display technologies can be used to support analytical reasoning and decision making. The aim is to provide multi-sensory interfaces that support collaboration and allow users to immerse themselves in their data in a way that supports real-world analytics tasks. Immersive Analytics builds on technologies such as large touch surfaces, immersive virtual and augmented reality environments, sensor devices and other, rapidly evolving, natural user interface devices. While there is a great deal of past and current work on improving the display technologies themselves, our focus in this position paper is on bringing attention to the higher-level usability and design issues in creating effective user interfaces for data analytics in immersive environments.