INF | Collaboration Infrastructure

Prof. Thomas Ertl, University of Stuttgart

Prof. Falk Schreiber, University of Konstanz

There is no advisor for this project.

Florian Frieß, University of Stuttgart

Dimitar Garkov, University of Konstanz

In the first funding period, Project INF supported the other projects of the SFB-TRR 161 by providing a central approach to data management and an infrastructure for virtual meetings on large, high-resolution displays.
The project will continue to do so in the upcoming funding period, employing the hardware and software infrastructure developed in the project, and will continue to manage accounts and access rights as necessary. Project INF provides the other projects with guidelines on how to use the research data management system. Since many projects in the SFB-TRR 161 conduct user studies and therefore handle personal data, Project INF will make sure that all of them provide GDPR-compliant documentation of their processing activities and of the processes used to protect personal data.

A dedicated tele-conferencing solution developed during the last funding period has been, and continues to be, used regularly for the lecture series and other invited talks. To include researchers at other sites (e.g., Ulm and Munich) who cannot join via the current solution, an RTSP server will be added to the tele-conferencing solution. Researchers will then be able to follow the lecture series and other talks with any standard media player.
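
To illustrate the consumer side, the following minimal sketch shows how such an RTSP stream could be received programmatically; the stream URL is a hypothetical placeholder, and in practice any standard media player (e.g. VLC or ffplay) can simply open the same address. The sketch uses Python with OpenCV's FFmpeg-backed capture interface.

# Minimal sketch: receiving an RTSP stream with OpenCV (FFmpeg backend).
# The URL below is a placeholder; the actual address is announced by Project INF.
import cv2

RTSP_URL = "rtsp://example-inf-server/lectures"  # hypothetical stream address

def watch_stream(url: str) -> None:
    capture = cv2.VideoCapture(url)  # open the RTSP stream
    if not capture.isOpened():
        raise RuntimeError(f"Could not open RTSP stream: {url}")
    try:
        while True:
            ok, frame = capture.read()  # fetch the next decoded video frame
            if not ok:
                break  # stream ended or connection lost
            cv2.imshow("SFB-TRR 161 lecture stream", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # quit on 'q'
                break
    finally:
        capture.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    watch_stream(RTSP_URL)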

An increasing number of research groups in the SFB-TRR 161 are conducting research related to the overall topic of immersion, covering a broad range of aspects such as visualization, rendering, interaction, and algorithmic implications. To support collaboration, Project INF plans to develop and deploy a new infrastructure that allows research to be conducted not only for, but also in virtual-reality (VR) and augmented-reality (AR) environments, including intermediate forms of mixed reality (MR). This will strongly improve the opportunities for collaborative research between projects of the SFB-TRR 161 and, at the same time, foster immersive analytics research, remote collaborative work, and data analysis in immersive environments. The development and deployment of such a shared infrastructure will reduce the need for coordination and travel, allow other projects to set up similar and comparable immersive environments with little effort, ease issues of equipment access, and directly influence designs and studies for collaborative immersive analytics research. Beyond these direct impacts, research results can be presented more easily, and various outreach activities become conceivable. To support natural collaborative interaction, technical solutions will be applied that, depending on the type of environment, track people, gestures, and devices and allow voice and camera data to be recorded and streamed.
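
The concrete tracking and streaming stack for these environments is not yet fixed; purely as an illustration, the following sketch (all identifiers, the message schema, and the multicast address are hypothetical) shows the kind of lightweight update with which tracked people and devices could be shared between sites over the network.

# Illustrative sketch only: broadcasting tracked entities as JSON over UDP multicast.
# Entity fields, message schema, multicast group, and port are hypothetical placeholders.
import json
import socket
import time
from dataclasses import dataclass, asdict

@dataclass
class TrackedEntity:
    entity_id: str      # e.g. "hmd-konstanz-01" or "person-03"
    kind: str           # "person", "hand", or "device"
    position: tuple     # x, y, z in metres, shared room coordinates
    orientation: tuple  # quaternion (x, y, z, w)
    timestamp: float    # seconds since the epoch

def broadcast(entities, group="239.0.0.161", port=5005):
    """Send one tracking update to all participating sites via UDP multicast."""
    message = json.dumps([asdict(e) for e in entities]).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
        sock.sendto(message, (group, port))

if __name__ == "__main__":
    broadcast([TrackedEntity("hmd-konstanz-01", "device",
                             (1.2, 1.6, 0.4), (0.0, 0.0, 0.0, 1.0),
                             time.time())])

Voice and camera data would not travel over such a lightweight channel; they could instead be handled by the existing tele-conferencing and RTSP components described above.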

Fig. 1: Talk at the Powerwall at the University of Konstanz

The RTSP streaming extension to the tele-conferencing solution has been implemented and is now used regularly for the lecture series and other invited talks.

Publications

  1. V. Bruder, C. Müller, S. Frey, and T. Ertl, “On Evaluating Runtime Performance of Interactive Visualizations,” IEEE Transactions on Visualization and Computer Graphics, vol. 26, pp. 2848–2862, 2020, doi: 10.1109/TVCG.2019.2898435.
  2. F. Frieß, C. Müller, and T. Ertl, “Real-Time High-Resolution Visualisation,” in Proceedings of the Eurographics Symposium on Vision, Modeling, and Visualization (VMV), 2020, pp. 127–135, doi: 10.2312/vmv.20201195.
  3. F. Frieß, M. Landwehr, V. Bruder, S. Frey, and T. Ertl, “Adaptive Encoder Settings for Interactive Remote Visualisation on High-Resolution Displays,” in Proceedings of the IEEE Symposium on Large Data Analysis and Visualization - Short Papers (LDAV), 2018, pp. 87–91, doi: 10.1109/LDAV.2018.8739215.