L. Joos, S. Jaeger-Honz, F. Schreiber, D. A. Keim, and K. Klein, “Visual Comparison of Networks in VR,”
IEEE Transactions on Visualization and Computer Graphics, vol. 28, no. 11, 2022, doi:
10.1109/TVCG.2022.3203001.
Abstract
Networks are an important means for the representation and analysis of data in a variety of research and application areas. While there are many efficient methods to create layouts for networks to support their visual analysis, approaches for the comparison of networks are still underexplored. Especially when it comes to the comparison of weighted networks, which is an important task in several areas, such as biology and biomedicine, there is a lack of efficient visualization approaches. With the availability of affordable high-quality virtual reality (VR) devices, such as head-mounted displays (HMDs), the research field of immersive analytics emerged and showed great potential for using the new technology for visual data exploration. However, the use of immersive technology for the comparison of networks is still underexplored. With this work, we explore how weighted networks can be visually compared in an immersive VR environment and investigate how visual representations can benefit from the extended 3D design space. For this purpose, we develop different encodings for 3D node-link diagrams supporting the visualization of two networks within a single representation and evaluate them in a pilot user study. We incorporate the results into a more extensive user study comparing node-link representations with matrix representations encoding two networks simultaneously. The data and tasks designed for our experiments are similar to those occurring in real-world scenarios. Our evaluation shows significantly better results for the node-link representations, which is contrary to comparable 2D experiments and indicates a high potential for using VR for the visual comparison of networks.
F. Schreiber and D. Weiskopf, “Quantitative Visual Computing,”
it - Information Technology, vol. 64, no. 4–5, 2022, doi:
10.1515/itit-2022-0048.
Y. Zhang, K. Klein, O. Deussen, T. Gutschlag, and S. Storandt, “Robust Visualization of Trajectory Data,”
it - Information Technology, vol. 64, no. 4–5, 2022, doi:
10.1515/itit-2022-0036.
Abstract
The analysis of movement trajectories plays a central role in many application areas, such as traffic management, sports analysis, and collective behavior research, where large and complex trajectory data sets are routinely collected these days. While automated analysis methods are available to extract characteristics of trajectories such as statistics on the geometry, movement patterns, and locations that might be associated with important events, human inspection is still required to interpret the results, derive parameters for the analysis, compare trajectories and patterns, and to further interpret the impact factors that influence trajectory shapes and their underlying movement processes. Every step in the acquisition and analysis pipeline might introduce artifacts or alter trajectory features, which might bias the human interpretation or confound the automated analysis. Thus, visualization methods as well as the visualizations themselves need to take into account the corresponding factors in order to allow sound interpretation without adding or removing important trajectory features or putting a large strain on the analyst. In this paper, we provide an overview of the challenges arising in robust trajectory visualization tasks. We then discuss several methods that contribute to improved visualizations. In particular, we present practical algorithms for simplifying trajectory sets that take semantic and uncertainty information directly into account. Furthermore, we describe a complementary approach that allows the uncertainty to be visualized along with the trajectories.
D. Bienroth
et al., “Spatially resolved transcriptomics in immersive environments,”
Visual Computing for Industry, Biomedicine, and Art, vol. 5, no. 1, 2022, doi:
10.1186/s42492-021-00098-6.
Abstract
Spatially resolved transcriptomics is an emerging class of high-throughput technologies that enable biologists to systematically investigate the expression of genes along with spatial information. Upon data acquisition, one major hurdle is the subsequent interpretation and visualization of the datasets acquired. To address this challenge, VR-Cardiomics is presented, which is a novel data visualization system with interactive functionalities designed to help biologists interpret spatially resolved transcriptomic datasets. By implementing the system in two separate immersive environments, fish tank virtual reality (FTVR) and head-mounted display virtual reality (HMD-VR), biologists can interact with the data in novel ways not previously possible, such as visually exploring the gene expression patterns of an organ, and comparing genes based on their 3D expression profiles. Further, a biologist-driven use case is presented, in which immersive environments facilitate biologists to explore and compare the heart expression profiles of different genes.
K. Klein, M. Sedlmair, and F. Schreiber, “Immersive Analytics: An Overview,”
it - Information Technology, vol. 64, no. 4–5, 2022, doi:
10.1515/itit-2022-0037.
Abstract
Immersive Analytics is concerned with the systematic examination of the benefits and challenges of using immersive environments for data analysis, and the development of corresponding designs that improve the quality and efficiency of the analysis process. While immersive technologies are now broadly available, practical solutions have not received broad acceptance in real-world applications outside of several core areas, and proper guidelines on the design of such solutions are still under development. Both fundamental research and applications bring together topics and questions from several fields, and open a wide range of directions regarding underlying theory, evidence from user studies, and practical solutions tailored towards the requirements of application areas. We give an overview of the concepts, topics, research questions, and challenges.
D. Garkov, C. Müller, M. Braun, D. Weiskopf, and F. Schreiber, “Research Data Curation in Visualization: Position Paper,” in
2022 IEEE Evaluation and Beyond - Methodological Approaches for Visualization (BELIV), M. Sedlmair, Ed. 2022, pp. 56–65. doi:
10.1109/BELIV57783.2022.00011.
Abstract
Research data curation is the act of carefully preparing research data and artifacts for sharing and long-term preservation. Research data management is centrally implemented and formally defined in a data management plan to enable data curation. In tandem, data curation and management facilitate research repeatability. In contrast to other research fields, data curation and management in visualization are not yet part of the researcher’s compendium. In this position paper, we discuss the unique challenges visualization faces and propose how data curation can be practically realized. We share eight lessons learned in managing data in two large research consortia, outline the larger curation workflow, and define the typical roles. We complement our lessons with minimum criteria for selecting a suitable data repository and five challenging scenarios that occur in practice. We conclude with a vision of how the visualization research community can pave the way for new curation standards.
K. Klein
et al., “Visual analytics of sensor movement data for cheetah behaviour analysis,”
Journal of Visualization, 2021, doi:
10.1007/s12650-021-00742-6.
Abstract
Current tracking technology such as GPS data loggers allows biologists to remotely collect large amounts of movement data for a large variety of species. Extending, and often replacing, interpretation based on observation, the analysis of the collected data supports research on animal behaviour, on the impact of factors such as climate change and human intervention, as well as on conservation programs. However, this analysis is difficult, due to the nature of the research questions and the complexity of the data sets. It requires both automated analysis, for example, for the detection of behavioural patterns, and human inspection, for example, for interpretation, inclusion of previous knowledge, and for conclusions on future actions and decision making. For this analysis and inspection, the movement data needs to be put into the context of environmental data, which helps to interpret the behaviour. Thus, a major challenge is to design and develop methods and intuitive interfaces that integrate the data for analysis by biologists. We present a concept and implementation for the visual analysis of cheetah movement data in a web-based fashion that allows usage both in the field and in office environments.
M. Aichem
et al., “Visual exploration of large metabolic models,”
Bioinformatics, vol. 37, no. 23, May 2021, doi:
10.1093/bioinformatics/btab335.
Abstract
Large metabolic models, including genome-scale metabolic models, are nowadays common in systems biology, biotechnology and pharmacology. They typically contain thousands of metabolites and reactions and therefore methods for their automatic visualization and interactive exploration can facilitate a better understanding of these models. We developed a novel method for the visual exploration of large metabolic models and implemented it in LMME (Large Metabolic Model Explorer), an add-on for the biological network analysis tool VANTED. The underlying idea of our method is to analyze a large model as follows. Starting from a decomposition into several subsystems, relationships between these subsystems are identified and an overview is computed and visualized. From this overview, detailed subviews may be constructed and visualized in order to explore subsystems and relationships in greater detail. Decompositions may either be predefined or computed, using built-in or self-implemented methods. Realized as an add-on for VANTED, LMME is embedded in a domain-specific environment, allowing for further related analysis at any stage during the exploration. We describe the method, provide a use case and discuss the strengths and weaknesses of different decomposition methods. The methods and algorithms presented here are implemented in LMME, an open-source add-on for VANTED. LMME can be downloaded from www.cls.uni-konstanz.de/software/lmme and VANTED can be downloaded from www.vanted.org. The source code of LMME is available from GitHub, at https://github.com/LSI-UniKonstanz/lmme.
K. Klein, M. Aichem, Y. Zhang, S. Erk, B. Sommer, and F. Schreiber, “TEAMwISE : synchronised immersive environments for exploration and analysis of animal behaviour,”
Journal of Visualization, 2021, doi:
10.1007/s12650-021-00746-2.
Abstract
The recent availability of affordable and lightweight tracking sensors allows researchers to collect large and complex movement data sets. To explore and analyse these data, applications are required that are capable of handling the data while providing an environment that enables the analyst(s) to focus on the task of investigating the movement in the context of the geographic environment it occurred in. We present an extensible, open-source framework for collaborative analysis of geospatial–temporal movement data with a use case in collective behaviour analysis. The framework TEAMwISE supports the concurrent usage of several program instances, allowing different perspectives on the same data in collocated or remote set-ups. The implementation can be deployed in a variety of immersive environments, for example, on a tiled display wall and mobile VR devices.
Abstract
After a long period of scepticism, more and more publications describe basic research but also practical approaches to how abstract data can be presented in immersive environments for effective and efficient data understanding. Central aspects of this important research question in immersive analytics research are concerned with the use of 3D for visualization, the embedding in the immersive space, the combination with spatial data, suitable interaction paradigms and the evaluation of use cases. We provide a characterization that facilitates the comparison and categorization of published works and present a survey of publications that gives an overview of the state of the art, current trends, and gaps and challenges in current research.
M. Kraus, K. Klein, J. Fuchs, D. A. Keim, F. Schreiber, and M. Sedlmair, “The Value of Immersive Visualization,”
IEEE Computer Graphics and Applications (CG&A), vol. 41, no. 4, 2021, doi:
10.1109/MCG.2021.3075258.
Abstract
In recent years, research on immersive environments has experienced a new wave of interest, and immersive analytics has been established as a new research field. Every year, a vast amount of different techniques, applications, and user studies are published that focus on employing immersive environments for visualizing and analyzing data. Nevertheless, immersive analytics is still a relatively unexplored field that needs more basic research in many aspects and is still viewed with skepticism. Rightly so, because in our opinion, many researchers do not fully exploit the possibilities offered by immersive environments and, on the contrary, sometimes even overestimate the power of immersive visualizations. Although a growing body of papers has demonstrated individual advantages of immersive analytics for specific tasks and problems, the general benefit of using immersive environments for effective analytic tasks remains controversial. In this article, we reflect on when and how immersion may be appropriate for the analysis and present four guiding scenarios. We report on our experiences, discuss the landscape of assessment strategies, and point out the directions where we believe immersive visualizations have the greatest potential.
K. Klein, D. Garkov, S. Rütschlin, T. Böttcher, and F. Schreiber, “QSDB—a graphical Quorum Sensing Database,”
Database, vol. 2021, Nov. 2021, doi:
10.1093/database/baab058.
Abstract
The human microbiome is largely shaped by the chemical interactions of its microbial members, which includes cross-talk via shared signals or quenching of the signalling of other species. Quorum sensing is a process that allows microbes to coordinate their behaviour depending on their population density and to adjust gene expression accordingly. We present the Quorum Sensing Database (QSDB), a comprehensive database of all published sensing and quenching relations between organisms and signalling molecules of the human microbiome, as well as an interactive web interface that allows browsing the database, provides graphical depictions of sensing mechanisms as Systems Biology Graphical Notation diagrams and links to other databases. Database URL: QSDB (Quorum Sensing DataBase) is freely available via an interactive web interface and as a downloadable CSV file at http://qsdb.org.
B. Sommer
et al., “Tiled Stereoscopic 3D Display Wall - Concept, Applications and Evaluation,”
Electronic Imaging, vol. 2019, no. 3, 2019, doi:
10.2352/ISSN.2470-1173.2019.3.SDA-641.
Abstract
The Tiled Stereoscopic 3D Display Wall (TS3DW) is a monitor system consisting of six consumer 3D TVs. Two monitors reside on each mobile display mount. One standard configuration is to use them at a 135-degree angle to each other, with one mobile mount in the center and one at each side. In this way, the system can be transported to multiple locations across a campus as well as used in different application scenarios. The system has already been used for a number of research projects and presentations.
K. Klein, M. Aichem, B. Sommer, S. Erk, Y. Zhang, and F. Schreiber, “TEAMwISE: Synchronised Immersive Environments for Exploration and Analysis of Movement Data,” in
Proceedings of the ACM Symposium on Visual Information Communication and Interaction (VINCI). ACM, 2019, pp. 9:1–9:5. doi:
10.1145/3356422.3356450.
Abstract
The recent availability of affordable and lightweight tracking sensors allows researchers to collect large and complex movement datasets. These datasets require applications that are capable of handling them whilst providing an environment that enables the analyst(s) to focus on the task of analysing the movement in the context of the geographic environment it occurred in. We present a framework for collaborative analysis of geospatial-temporal movement data with a use case in collective behaviour analysis. It supports the concurrent usage of several program instances, allowing different perspectives on the same data in collocated or remote setups. The implementation can be deployed in a variety of immersive environments, e.g. on a tiled display wall or mobile VR devices.
S. Jaeger
et al., “Challenges for Brain Data Analysis in VR Environments,” in
2019 IEEE Pacific Visualization Symposium (PacificVis). 2019, pp. 42–46. doi:
10.1109/PacificVis.2019.00013.
Abstract
Analysing and understanding brain function and disorder is the main focus of neuroscience. Due to the high complexity of the brain, directionality of the signal and changing activity over time, visual exploration and data analysis are difficult. For this reason, a vast number of research challenges are still unsolved. We explored different challenges of the visual analysis of brain data and the design of corresponding immersive environments in collaboration with experts from the biomedical domain. We built a prototype of an immersive virtual reality environment to explore the design space and to investigate how brain data analysis can be supported by a variety of design choices. Our environment can be used to study the effect of different visualisations and combinations of brain data representation, such as network layouts, anatomical mapping or time series. As a long-term goal, we aim to aid neuroscientists in a better understanding of brain function and disorder.
K. Klein
et al., “Fly with the flock : immersive solutions for animal movement visualization and analytics,”
Journal of the Royal Society Interface, vol. 16, no. 153, 2019, doi:
10.1098/rsif.2018.0794.
Abstract
Understanding the movement of animals is important for a wide range of scientific interests including migration, disease spread, collective movement behaviour and analysing motion in relation to dynamic changes of the environment such as wind and thermal lifts. Particularly, the three-dimensional (3D) spatial–temporal nature of bird movement data, which is widely available with high temporal and spatial resolution at large volumes, presents a natural option to explore the potential of immersive analytics (IA). We investigate the requirements and benefits of a wide range of immersive environments for explorative visualization and analytics of 3D movement data, in particular regarding design considerations for such 3D immersive environments, and present prototypes for IA solutions. Tailored to biologists studying bird movement data, the immersive solutions enable geo-locational time-series data to be investigated interactively, thus enabling experts to visually explore interesting angles of a flock and its behaviour in the context of the environment. The 3D virtual world presents the audience with engaging and interactive content, allowing users to ‘fly with the flock’, with the potential to ascertain an intuitive overview of often complex datasets, and to provide the opportunity thereby to formulate and at least qualitatively assess hypotheses. This work also contributes to ongoing research efforts to promote better understanding of bird migration and the associated environmental factors at the global scale, thereby providing a visual vehicle for driving public awareness of environmental issues and bird migration patterns.
Abstract
Recent advances in tracking technology allow biologists to collect large amounts of movement data for a variety of species. Analysis of the collected data supports research on animal behaviour, influence of impact factors such as climate change and human intervention, as well as conservation programs. Analysis of the movement data is difficult, due to the nature of the research questions and the complexity of the data sets. It requires both automated analysis, e.g. for the detection of behavioural patterns, and human inspection, e.g. for interpretation, inclusion of previous knowledge, and for conclusions on future actions and decision making. We present a concept and implementation for the visual analysis of cheetah movement data in a web-based fashion that allows usage both in the field and in office environments.
V. Yoghourdjian, T. Dwyer, K. Klein, K. Marriott, and M. Wybrow, “Graph Thumbnails: Identifying and Comparing Multiple Graphs at a Glance,”
IEEE Transactions on Visualization and Computer Graphics, vol. 24, no. 12, 2018, doi:
10.1109/TVCG.2018.2790961.
Abstract
We propose Graph Thumbnails, small icon-like visualisations of the high-level structure of network data. Graph Thumbnails are designed to be legible in small multiples to support rapid browsing within large graph corpora. Compared to existing graph-visualisation techniques our representation has several advantages: (1) the visualisation can be computed in linear time; (2) it is canonical in the sense that isomorphic graphs will always have identical thumbnails; and (3) it provides precise information about the graph structure. We report the results of two user studies. The first study compares Graph Thumbnails to node-link and matrix views for identifying similar graphs. The second study investigates the comprehensibility of the different representations. We demonstrate the usefulness of this representation for summarising the evolution of protein-protein interaction networks across a range of species.
M. Klapperstueck
et al., “Contextuwall: Multi-site Collaboration Using Display Walls,”
Journal of Visual Languages & Computing, vol. 46, pp. 35–42, 2018, doi:
10.1016/j.jvlc.2017.10.002.
Abstract
The emerging field of Immersive Analytics investigates how novel display and interaction technologies can enable people to better explore and analyse data and complex information. Collaboration is a crucial aspect of Immersive Analytics. In this paper we present ContextuWall, a system for interactive local and remote collaboration using touch and mobile devices as well as displays of various sizes. The system enables groups of users located at different sites to share content on a jointly used virtual desktop which is accessible over a secured network. This virtual desktop can be shown on different large displays simultaneously, taking advantage of their high resolution. To enable users to intuitively share, arrange, and annotate image content, a purpose-built client software has been developed and can easily be adapted with plug-ins for existing data analytics software. We show exemplary use cases and describe the system architecture and its implementation.
K. Marriott
et al.,
Immersive Analytics, Lecture Notes in Computer Science (LNCS), vol. 11190. Springer International Publishing, 2018. doi:
10.1007/978-3-030-01388-2.
Abstract
Immersive Analytics is a new research initiative that aims to remove barriers between people, their data and the tools they use for analysis and decision making. Here the aims of immersive analytics research are clarified, its opportunities and historical context are outlined, and a broad research agenda for the field is provided. In addition, we review how the term immersion has been used to refer to both technological and psychological immersion, both of which are central to immersive analytics research.
Y. Zhu
et al., “Genome-scale Metabolic Modeling of Responses to Polymyxins in Pseudomonas Aeruginosa,”
GigaScience, vol. 7, no. 4, 2018, doi:
10.1093/gigascience/giy021.
Abstract
Background
Pseudomonas aeruginosa often causes multidrug-resistant infections in immunocompromised patients, and polymyxins are often used as the last-line therapy. Alarmingly, resistance to polymyxins has been increasingly reported worldwide recently. To rescue this last-resort class of antibiotics, it is necessary to systematically understand how P. aeruginosa alters its metabolism in response to polymyxin treatment, thereby facilitating the development of effective therapies. To this end, a genome-scale metabolic model (GSMM) was used to analyze bacterial metabolic changes at the systems level.
Findings
A high-quality GSMM iPAO1 was constructed for P. aeruginosa PAO1 for antimicrobial pharmacological research. Model iPAO1 encompasses an additional periplasmic compartment and contains 3022 metabolites, 4265 reactions, and 1458 genes in total. Growth prediction on 190 carbon and 95 nitrogen sources achieved an accuracy of 89.1%, outperforming all reported P. aeruginosa models. Notably, prediction of the essential genes for growth achieved a high accuracy of 87.9%. Metabolic simulation showed that lipid A modifications associated with polymyxin resistance exert a limited impact on bacterial growth and metabolism but remarkably change the physiochemical properties of the outer membrane. Modeling with transcriptomics constraints revealed a broad range of metabolic responses to polymyxin treatment, including reduced biomass synthesis, upregulated amino acid catabolism, induced flux through the tricarboxylic acid cycle, and increased redox turnover.
Conclusions
Overall, iPAO1 represents the most comprehensive GSMM constructed to date for Pseudomonas. It provides a powerful systems pharmacology platform for the elucidation of complex killing mechanisms of antibiotics.
M. Ghaffar
et al., “3D Modelling and Visualisation of Heterogeneous Cell Membranes in Blender,” in
Proceedings of the 11th International Symposium on Visual Information Communication and Interaction. Växjö, Sweden: Association for Computing Machinery, 2018, pp. 64–71. doi:
10.1145/3231622.3231639.
Abstract
Chlamydomonas reinhardtii cells have been in the focus of research for more than a decade, in particular due to their use as an alternative source for energy production. However, the molecular processes in these cells are still not completely known, and 3D visualisations may help to understand these complex interactions and processes. In previous work, we presented the stereoscopic 3D (S3D) visualisation of a complete Chlamydomonas reinhardtii cell created with the 3D modelling framework Blender. This animation already contained a scene showing an illustrative membrane model of the thylakoid membrane. During discussion with domain experts, shortcomings of the visualisation for several detailed analysis questions were identified and it was decided to redefine it. A new modelling and visualisation pipeline based on a Membrane Packing Algorithm was developed, which can be configured via a user interface, enabling the composition of membranes employing published material. An expert user study was conducted to evaluate this new approach, with half the participants having a biology and the other half having an informatics background. The new and old Chlamydomonas thylakoid membrane models were presented on a S3D back projection system. The evaluation results reveal that the majority of participants preferred the new, more realistic membrane visualisation. However, the opinion varied with the expertise, leading to valuable conclusions for future visualisations. Interestingly, the S3D presentation of molecular structures led to a positive change in opinion regarding S3D technology.
M. de Ridder, K. Klein, and J. Kim, “A Review and Outlook on Visual Analytics for Uncertainties in Functional Magnetic Resonance Imaging,”
Brain Informatics, vol. 5, no. 2, 2018, doi:
10.1186/s40708-018-0083-0.
Abstract
Analysis of functional magnetic resonance imaging (fMRI) plays a pivotal role in uncovering an understanding of the brain. fMRI data contain both spatial volume and temporal signal information, which provide a depiction of brain activity. The analysis pipeline, however, is hampered by numerous uncertainties in many of the steps; often seen as one of the last hurdles for the domain. In this review, we categorise fMRI research into three pipeline phases: (i) image acquisition and processing; (ii) image analysis; and (iii) visualisation and human interpretation, to explore the uncertainties that arise in each phase, including the compound effects due to the inter-dependence of steps. Attempts at mitigating uncertainties rely on providing interactive visual analytics that aid users in understanding the effects of the uncertainties and adjusting their analyses. This impetus for visual analytics comes in light of considerable research investigating uncertainty throughout the pipeline. However, to the best of our knowledge, there is yet to be a comprehensive review on the importance and utility of uncertainty visual analytics (UVA) in addressing fMRI concerns, which we term fMRI-UVA. Such techniques have been broadly implemented in related biomedical fields, and its potential for fMRI has recently been explored; however, these attempts are limited in their scope and utility, primarily focussing on addressing small parts of single pipeline phases. Our comprehensive review of the fMRI uncertainties from the perspective of visual analytics addresses the three identified phases in the pipeline. We also discuss the two interrelated approaches for future research opportunities for fMRI-UVA.
M. de Ridder, K. Klein, and J. Kim, “Temporaltracks: Visual Analytics for Exploration of 4D fMRI Time-series Coactivation,” in
Proceedings of the Computer Graphics International Conference (CGI), X. Mao, D. Thalmann, and M. L. Gavrilova, Eds. ACM, 2017, pp. 13:1–13:6. doi:
10.1145/3095140.3095153.
Abstract
Functional magnetic resonance imaging (fMRI) is a 4D medical imaging modality that depicts a proxy of neuronal activity in a series of temporal scans. Statistical processing of the modality shows promise in uncovering insights about the functioning of the brain, such as the default mode network, and characteristics of mental disorders. Current statistical processing generally summarises the temporal signals between brain regions into a single data point to represent the 'coactivation' of the regions. That is, how similar are their temporal patterns over the scans. However, the potential of such processing is limited by issues of possible data misrepresentation due to uncertainties, e.g. noise in the data. Moreover, it has been shown that brain signals are characterised by brief traces of coactivation, which are lost in the single value representations. To alleviate the issues, alternate statistical processes have been used, however creating effective techniques has proven difficult due to problems, e.g. issues with noise, which often require user input to uncover. Visual analytics, therefore, through its ability to interactively exploit human expertise, presents itself as an interesting approach of benefit to the domain. In this work, we present the conceptual design behind TemporalTracks, our visual analytics system for exploration of 4D fMRI time-series coactivation data, utilising a visual metaphor to effectively present coactivation data for easier understanding. We describe our design with a case study visually analysing Human Connectome Project data, demonstrating that TemporalTracks can uncover temporal events that would otherwise be hidden in standard analysis.
H. T. Nim
et al., “Design Considerations for Immersive Analytics of Bird Movements Obtained by Miniaturised GPS Sensors,” in
Proceedings of the Eurographics Workshop on Visual Computing for Biology and Medicine (VCBM). Eurographics Association, 2017. doi:
10.2312/vcbm.20171234.
Abstract
Recent advances in miniaturising sensor tags allow researchers to obtain high-resolution bird trajectories, presenting an opportunity for immersive close-up observation of individual and group behaviour in mid-air. The combination of geographical, environmental, and movement data is well suited for investigation in immersive analytics environments. We explore the benefits and requirements of a wide range of such environments, and illustrate a multi-platform immersive analytics solution, based on a tiled 3D display wall and head-mounted displays (Google Cardboard, HTC Vive and Microsoft Hololens). Tailored to biologists studying bird movement data, the immersive environment provides a novel interactive mode to explore the geolocational time-series data. This paper aims to inform the 3D visualisation research community about design considerations obtained from a real world data set in different 3D immersive environments. This work also contributes to ongoing research efforts to promote better understanding of bird migration and the associated environmental factors at the planetary scale, thereby raising public awareness of environmental issues.
T. Chandler
et al., “Immersive Analytics,” in
Proceedings of the IEEE Symposium on Big Data Visual Analytics (BDVA). IEEE, 2015, pp. 73–80. doi:
10.1109/BDVA.2015.7314296.
Abstract
Immersive Analytics is an emerging research thrust investigating how new interaction and display technologies can be used to support analytical reasoning and decision making. The aim is to provide multi-sensory interfaces that support collaboration and allow users to immerse themselves in their data in a way that supports real-world analytics tasks. Immersive Analytics builds on technologies such as large touch surfaces, immersive virtual and augmented reality environments, sensor devices and other, rapidly evolving, natural user interface devices. While there is a great deal of past and current work on improving the display technologies themselves, our focus in this position paper is on bringing attention to the higher-level usability and design issues in creating effective user interfaces for data analytics in immersive environments.