A05 | Image/Video Quality Assessment: From Test Databases to Similarity-Aware and Perceptual Dynamic Metrics

Prof. Dietmar Saupe, Universität Konstanz

Jun.-Prof. Martin Fuchs, Universität Stuttgart

Vlad Hosu, Universität Konstanz

Franz Hahn, Universität Konstanz

The project addresses methods for automated visual quality assessment and their validation beyond mean opinion scores. We propose to enhance these methods by incorporating similarity awareness and predicted eye-movement sequences, to quantify the perceptual viewing experience, and to apply the resulting metrics in quality-aware media processing. Moreover, we will build and use media databases that are diverse in content and authentic in their distortions, in contrast to current scientific datasets.

Research Questions

How can crowdsourcing be applied to help generate very large video databases for research on multimedia quality?

How well do state-of-the-art video quality assessment methods, which were designed on small training sets, perform on such large and diverse media databases?

Quality assessment in such extremely large empirical studies requires crowdsourcing. How should it be organized to achieve sufficient reliability and efficiency?
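One common way to pursue reliability in such crowdsourced studies, sketched below under illustrative assumptions (agreement measure, threshold, and data are not taken from the project), is to screen out workers whose ratings disagree with the rest of the group before averaging the remaining scores into mean opinion scores (MOS):

```python
# Sketch: screening crowd workers by inter-rater agreement before computing
# mean opinion scores (MOS). The correlation threshold is an assumption.
from statistics import mean

def pearson(x, y):
    # Pearson correlation; returns 0.0 for constant (zero-variance) input.
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def screen_workers(ratings, threshold=0.5):
    """ratings: {worker_id: [score per stimulus]}. Keep workers whose scores
    correlate with the mean of all other workers above the threshold."""
    kept = {}
    for w, scores in ratings.items():
        others = [v for k, v in ratings.items() if k != w]
        mean_others = [mean(col) for col in zip(*others)]
        if pearson(scores, mean_others) >= threshold:
            kept[w] = scores
    return kept

def mos(ratings):
    # Mean opinion score per stimulus over the retained workers.
    return [mean(col) for col in zip(*ratings.values())]
```

For example, a worker who gives the same score to every stimulus carries no ranking information and is dropped, while consistent raters are retained.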

Are machine learning techniques suitable for identifying the best-performing video quality assessment metrics for given media content?

What statistical/perceptual features should be extracted to express similarity for this task? 

How can one design new or hybrid strategies for video quality assessment based on the above?
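A minimal sketch of such a similarity-aware strategy, under the assumption that per-content feature vectors and the best-performing metric on each training content are already available (all inputs below are hypothetical), is to choose the metric that worked best on the most similar known content:

```python
# Sketch: similarity-aware metric selection via nearest neighbour in a
# perceptual feature space. Feature vectors and metric labels are assumed
# to come from prior evaluation; the ones used here are illustrative.
import math

def nearest_metric(features, train_feats, train_best):
    """features: feature vector of the new content.
    train_feats: feature vectors of known contents.
    train_best: name of the best-performing metric per known content."""
    dists = [math.dist(features, f) for f in train_feats]
    return train_best[dists.index(min(dists))]
```

More elaborate variants would replace the single nearest neighbour by a learned classifier or by a weighted combination of the candidate metrics.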

Can we improve methods for image/video quality assessment by studying patterns of human visual attention and other perceptual aspects?

How can knowledge of human visual attention, derived from eye-tracking studies, be incorporated into perceptual image/video quality assessment methods?
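One simple way attention data can enter a full-reference metric, sketched here with flat pixel lists and illustrative saliency values (not the project's actual method), is to weight per-pixel distortion by predicted saliency so that errors in attended regions count more:

```python
# Sketch: saliency-weighted mean squared error. Pixels where viewers are
# predicted to look (higher saliency weight) contribute more to the score.
def weighted_mse(ref, dist, saliency):
    """ref, dist: pixel values of reference and distorted image (flat lists).
    saliency: non-negative attention weight per pixel."""
    total_w = sum(saliency)
    return sum(w * (r - d) ** 2
               for r, d, w in zip(ref, dist, saliency)) / total_w
```

With a uniform saliency map this reduces to ordinary MSE; a concentrated map ignores distortions outside the attended region.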

How can the quality assessment methods be applied in quality-aware media processing such as perceptual coding?

Training Better Algorithms to Predict Subjective Quality Opinions

Saliency-Driven Compression


Publications

  1. M. Spicker, F. Hahn, T. Lindemeier, D. Saupe, and O. Deussen, “Quantifying Visual Abstraction Quality for Stipple Drawings,” in Proceedings of NPAR ’17, 2017.
  2. V. Hosu, F. Hahn, M. Jenadeleh, H. Lin, H. Men, T. Szirányi, S. Li, and D. Saupe, “The Konstanz natural video database (KoNViD-1k),” in 9th International Conference on Quality of Multimedia Experience (QoMEX), 2017.
  3. V. Hosu, F. Hahn, O. Wiedemann, S.-H. Jung, and D. Saupe, “Saliency-driven image coding improves overall perceived JPEG quality,” in Picture Coding Symposium (PCS), 2016.
  4. I. Zingman, D. Saupe, O. Penatti, and K. Lambers, “Detection of Fragmented Rectangular Enclosures in Very High Resolution Remote Sensing Images,” 2016.
  5. D. Saupe, F. Hahn, V. Hosu, I. Zingman, M. Rana, and S. Li, “Crowd workers proven useful: A comparative study of subjective video quality assessment,” in 8th International Conference on Quality of Multimedia Experience (QoMEX), Lisbon, Portugal, 2016.
  6. V. Hosu, F. Hahn, I. Zingman, and D. Saupe, “Reported Attention as a Promising Alternative to Gaze in IQA Tasks,” in 5th International Workshop on Perceptual Quality of Systems (PQS 2016), Berlin, 2016.