1. J. Zagermann, U. Pfeil, D. Fink, P. von Bauer, and H. Reiterer, “Memory in Motion: The Influence of Gesture- and Touch-Based Input Modalities on Spatial Memory,” 2017.
  2. D. Jäckle, F. Stoffel, S. Mittelstädt, D. A. Keim, and H. Reiterer, “Interpretation of Dimensionally-Reduced Crime Data: A Study with Untrained Domain Experts,” 2017, pp. 164–175.
  3. A. Nesti, K. de Winkel, and H. Bülthoff, “Accumulation of inertial sensory information in the perception of whole body yaw rotation,” PLoS ONE, 2017.
  4. C. Schulz, A. Nocaj, J. Goertler, O. Deussen, U. Brandes, and D. Weiskopf, “Probabilistic Graph Layout for Uncertain Network Visualization,” vol. 23, no. 1, 2017.
  5. M. Krone, F. Frieß, K. Scharnowski, G. Reina, S. Fademrecht, T. Kulschewski, J. Pleiss, and T. Ertl, “Molecular Surface Maps,” IEEE Transactions on Visualization and Computer Graphics (Proceedings of Scientific Visualization 2016), vol. 23, no. 1, 2017.
  6. P. Tutzauer, S. Becker, D. Fritsch, T. Niese, and O. Deussen, “A Study of the Human Comprehension of Building Categories Based on Different 3D Building Representations,” Photogrammetrie - Fernerkundung - Geoinformation, vol. 2016, no. 5–6, pp. 319–333, 2016.
  7. L. Lischke, S. Mayer, K. Wolf, N. Henze, H. Reiterer, and A. Schmidt, “Screen arrangements and interaction areas for large display work places,” in Proceedings of the 5th ACM International Symposium on Pervasive Displays (PerDis 2016), pp. 228–234, 2016.
  8. J. Zagermann, U. Pfeil, and H. Reiterer, “Measuring Cognitive Load using Eye Tracking Technology in Visual Computing,” in Proceedings of the Sixth Workshop on Beyond Time and Errors on Novel Evaluation Methods for Visualization (BELIV 2016), 2016, pp. 78–85.
  9. J. Hildenbrand, A. Nocaj, and U. Brandes, “Flexible Level-of-Detail Rendering for Large Graphs,” in Graph Drawing and Network Visualization: 24th International Symposium (GD 2016), vol. 9801, 2016.
  10. A. Hautli-Janisz and V. Lyding, VisLR II: Visualization as Added Value in the Development, Use and Evaluation of Language Resources. 2016.
  11. M. Hund, D. Böhm, W. Sturm, M. Sedlmair, T. Schreck, T. Ullrich, D. A. Keim, L. Majnaric, and A. Holzinger, “Visual analytics for concept exploration in subspaces of patient groups,” Brain Informatics, vol. 3, no. 4, pp. 233–247, 2016.
  12. M. Behrisch, B. Bach, M. Hund, L. von Rüden, M. Delz, J.-D. Fekete, and T. Schreck, “Magnostics: Image-based Search of Interesting Matrix Views for Guided Network Exploration,” 2016, no. 99, pp. 1–1.
  13. O. Johannsen, A. Sulc, N. Marniok, and B. Goldluecke, “Layered scene reconstruction from multiple light field camera views,” 2016.
  14. I. Zingman, D. Saupe, O. Penatti, and K. Lambers, “Detection of Fragmented Rectangular Enclosures in Very High Resolution Remote Sensing Images,” 2016.
  15. D. Weiskopf, M. Burch, L. L. Chuang, B. Fischer, and A. Schmidt, Eye Tracking and Visualization: Foundations, Techniques, and Applications. Berlin, Heidelberg: Springer, 2016.
  16. N. Flad, J. Ditz, H. H. Bülthoff, and L. L. Chuang, “Data-driven approaches to unrestricted gaze-tracking benefit from saccade filtering,” Second Workshop on Eye Tracking and Visualization, IEEE Visualization 2016, 2016.
  17. A. Nocaj, M. Ortmann, and U. Brandes, “Adaptive Disentanglement based on Local Clustering in Small-World Network Visualization,” IEEE Transactions on Visualization and Computer Graphics, vol. 22, no. 6, pp. 1662–1671, 2016.
  18. D. Saupe, F. Hahn, V. Hosu, I. Zingman, M. Rana, and S. Li, “Crowd workers proven useful: A comparative study of subjective video quality assessment,” 8th International Conference on Quality of Multimedia Experience (QoMEX 2016), Lisbon, Portugal, 2016.
  19. V. Hosu, F. Hahn, I. Zingman, and D. Saupe, “Reported Attention as a Promising Alternative to Gaze in IQA Tasks,” 5th International Workshop on Perceptual Quality of Systems 2016 (PQS 2016), Berlin, 2016.
  20. P. Tutzauer, S. Becker, T. Niese, O. Deussen, and D. Fritsch, “Understanding Human Perception of Building Categories in Virtual 3D Cities - a User Study,” ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XLI–B2, pp. 683–687, 2016.
  21. K. Kurzhals, M. Hlawatsch, M. Burch, and D. Weiskopf, “Fixation-Image Charts,” in ETRA 2016, pp. 11–18.
  22. C. Schulz, A. Nocaj, M. El-Assady, S. Frey, M. Hlawatsch, M. Hund, G. K. Karch, R. Netzel, C. Schätzle, M. Butt, D. Keim, T. Ertl, U. Brandes, and D. Weiskopf, “Generative Data Models for Validation and Evaluation of Visualization Techniques,” to appear, 2016.
  23. R. Netzel, M. Burch, and D. Weiskopf, “Interactive Scanpath-Oriented Annotation of Fixations,” 2016, pp. 183–187.
  24. M. Burch, R. Woods, R. Netzel, and D. Weiskopf, “The Challenges of Designing Metro Maps,” 2016, pp. 195–202.
  25. T. Blascheck, F. Beck, S. Baltes, T. Ertl, and D. Weiskopf, “Visual Analysis and Coding of Data-Rich User Behavior,” to appear, 2016.
  26. C. Schulz, M. Burch, F. Beck, and D. Weiskopf, “Visual Data Cleansing of Low-Level Eye Tracking Data,” in Extended Papers of ETVIS 2015, to appear, 2016.
  27. M. Hund, I. Färber, M. Behrisch, A. Tatu, T. Schreck, D. A. Keim, and T. Seidl, “Visual Quality Assessment of Subspace Clusterings,” 2016.
  28. J. Müller, R. Rädle, and H. Reiterer, “Virtual Objects as Spatial Cues in Collaborative Mixed Reality Environments: How They Shape Communication Behavior and User Task Load,” ACM, 2016.
  29. J. Zagermann, U. Pfeil, R. Rädle, H.-C. Jetter, C. Klokmose, and H. Reiterer, “When Tablets meet Tabletops: The Effect of Tabletop Size on Around-the-Table Collaboration with Personal Tablets,” 2016.
  30. M. Scheer, H. H. Bülthoff, and L. L. Chuang, “Steering Demands Diminish the Early-P3, Late-P3 and RON Components of the Event-Related Potential of Task-Irrelevant Environmental Sounds,” 2016, vol. 10, no. 73.
  31. L. L. Chuang and H. H. Bülthoff, “Towards a Better Understanding of Gaze Behavior in the Automobile,” in Workshop on Practical Experiences in Measuring and Modeling Drivers and Driver-Vehicle Interactions, in conjunction with AutomotiveUI 2015, 2015.
  32. N. Flad, T. Fomina, H. H. Bülthoff, and L. L. Chuang, “Unsupervised clustering of EOG as a viable substitute for optical eye-tracking,” in press, First Workshop on Eye Tracking and Visualization at IEEE Visualization, 2015.
  33. L. L. Chuang, “Error visualization and information-seeking behavior for air-vehicle control,” Foundations of Augmented Cognition. Lecture Notes in Artificial Intelligence, vol. 9183, pp. 3–11, 2015.
  34. C. Schulz, M. Burch, and D. Weiskopf, “Visual Data Cleansing of Eye Tracking Data,” 2015.
  35. K. Kurzhals, B. Fisher, M. Burch, and D. Weiskopf, “Eye Tracking Evaluation of Visual Analytics,” 2015.
  36. M. Spicker, J. Kratt, D. Arellano, and O. Deussen, “Depth-Aware Coherent Line Drawings,” ACM, 2015.
  37. M. Hund, M. Behrisch, I. Färber, M. Sedlmair, T. Schreck, T. Seidl, and D. Keim, “Subspace Nearest Neighbor Search - Problem Statement, Approaches, and Discussion,” 2015.
  38. K. Kurzhals, M. Burch, T. Pfeiffer, and D. Weiskopf, “Eye Tracking in Computer-Based Visualization,” Computing in Science and Engineering, vol. 17, no. 5, pp. 64–71, 2015.
  39. K. Kurzhals, M. Hlawatsch, F. Heimerl, M. Burch, T. Ertl, and D. Weiskopf, “Gaze Stripes: Image-Based Visualization of Eye Tracking Data,” IEEE Xplore Digital Library, 2015.