INF | Collaboration Infrastructure

Prof. Thomas Ertl, University of Stuttgart

Prof. Falk Schreiber, University of Konstanz


There is no advisor for this project.

Dimitar Garkov, University of Konstanz

In the first funding period, Project INF supported the other projects of the SFB-TRR 161 by providing a central approach to data management and an infrastructure for virtual meetings in a large, high-resolution display scenario.
The project will continue to do so in the upcoming funding period, employing the existing hardware and software infrastructure developed in the project, and will continue to manage accounts and access rights as necessary. Project INF provides the other projects with guidelines on how to use the research data management system. Since many projects in the SFB-TRR 161 conduct user studies, and therefore handle personal data, Project INF will ensure that all projects provide GDPR-compliant documentation of their processing activities and of their processes for protecting personal data.

A dedicated tele-conferencing solution developed during the last funding period was, and continues to be, regularly used for the lecture series and other invited talks. To include researchers at other sites (e.g., Ulm and Munich) who cannot join via the current solution, an RTSP server will be added to the tele-conferencing setup. With this, researchers will be able to follow the lecture series and other talks using any standard media player.
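Because the stream is exposed via plain RTSP, joining a talk amounts to pointing a standard media player at the stream URL. The following is a minimal sketch of how a client-side script might construct and sanity-check such a URL before handing it to a player; the host name and stream path are hypothetical, and 554 is simply the default RTSP port:

```python
from urllib.parse import urlparse

def build_rtsp_url(host: str, path: str = "lectures", port: int = 554) -> str:
    """Build an RTSP URL for a lecture stream (host and path are placeholders)."""
    return f"rtsp://{host}:{port}/{path}"

def is_valid_rtsp_url(url: str) -> bool:
    """Check that the URL uses the rtsp scheme and has a host and port."""
    parsed = urlparse(url)
    return parsed.scheme == "rtsp" and bool(parsed.hostname) and parsed.port is not None

url = build_rtsp_url("stream.example.org")
print(url)  # rtsp://stream.example.org:554/lectures
print(is_valid_rtsp_url(url))  # True
```

A researcher could then open the checked URL with any RTSP-capable player, e.g. `vlc rtsp://stream.example.org:554/lectures` or the equivalent `ffplay` invocation.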

An increasing number of research groups in the SFB-TRR 161 are conducting research related to the overall topic of immersion, covering a broad range of aspects such as visualization, rendering, interaction, and algorithmic implications. To support this collaboration, Project INF plans to develop and deploy a new infrastructure that allows research to be conducted not only for, but also in, virtual-reality (VR) and augmented-reality (AR) environments, including intermediate forms of mixed reality (MR). This will strongly improve the opportunities for collaborative research between projects of the SFB-TRR 161 and, at the same time, foster immersive analytics research, remote collaborative work, and data analysis in immersive environments.

The development and deployment of such a shared infrastructure will reduce the need for coordination and travel, allow other projects to set up similar and comparable immersive environments with little effort, ease issues of equipment access, and directly influence designs and studies for collaborative immersive analytics research. Beyond these direct impacts, research results can be presented more easily, and various outreach activities become possible. To support natural collaborative interaction, Project INF will apply technical solutions, depending on the type of environment, that track people, gestures, and devices and that record and stream voice and camera data.

Fig. 1: Talk at the Powerwall at the University of Konstanz

The RTSP streaming extension to the tele-conferencing solution has been implemented and is now regularly used for the lecture series and other invited talks.


  1. D. Bienroth et al., “Spatially resolved transcriptomics in immersive environments,” Visual Computing for Industry, Biomedicine, and Art, vol. 5, no. 1, 2022, doi: 10.1186/s42492-021-00098-6.
  2. F. Schreiber and D. Weiskopf, “Quantitative Visual Computing,” it - Information Technology, vol. 64, no. 4–5, 2022, doi: 10.1515/itit-2022-0048.
  3. F. Frieß, M. Becher, G. Reina, and T. Ertl, “Amortised Encoding for Large High-Resolution Displays,” in 2021 IEEE 11th Symposium on Large Data Analysis and Visualization (LDAV), 2021, pp. 53–62. doi: 10.1109/LDAV53230.2021.00013.
  4. K. Klein, D. Garkov, S. Rütschlin, T. Böttcher, and F. Schreiber, “QSDB—a graphical Quorum Sensing Database,” Database, vol. 2021, Nov. 2021, doi: 10.1093/database/baab058.
  5. V. Bruder, C. Müller, S. Frey, and T. Ertl, “On Evaluating Runtime Performance of Interactive Visualizations,” IEEE Transactions on Visualization and Computer Graphics, vol. 26, pp. 2848–2862, Sep. 2020, doi: 10.1109/TVCG.2019.2898435.
  6. F. Frieß, M. Braun, V. Bruder, S. Frey, G. Reina, and T. Ertl, “Foveated Encoding for Large High-Resolution Displays,” IEEE Transactions on Visualization and Computer Graphics, vol. 27, no. 2, 2020, doi: 10.1109/TVCG.2020.3030445.
  7. F. Frieß, C. Müller, and T. Ertl, “Real-Time High-Resolution Visualisation,” in Proceedings of the Eurographics Symposium on Vision, Modeling, and Visualization (VMV), 2020, pp. 127–135. doi: 10.2312/vmv.20201195.
  8. C. Müller, M. Braun, and T. Ertl, “Optimised Molecular Graphics on the HoloLens,” in IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 2019, pp. 97–102. doi: 10.1109/VR.2019.8798111.
  9. F. Frieß, M. Landwehr, V. Bruder, S. Frey, and T. Ertl, “Adaptive Encoder Settings for Interactive Remote Visualisation on High-Resolution Displays,” in Proceedings of the IEEE Symposium on Large Data Analysis and Visualization - Short Papers (LDAV), 2018, pp. 87–91. doi: 10.1109/LDAV.2018.8739215.