Semantic-aware blind image quality assessment

Ernestasia Siahaan, Alan Hanjalic, Judith A. Redi

Research output: Contribution to journal › Article › Scientific › peer-review

22 Citations (Scopus)

Abstract

Many studies have indicated that predicting users’ perception of visual quality depends on various factors other than artifact visibility alone, such as viewing environment, social context, or user personality. Exploiting information on these factors, when applicable, can improve users’ quality of experience while saving resources. In this paper, we improve the performance of existing no-reference image quality metrics (NR-IQM) using image semantic information (scene and object categories), building on our previous findings that image scene and object categories influence user judgment of visual quality. We show that adding scene category features, object category features, or the combination of both to perceptual quality features results in significantly higher correlation with user judgment of visual quality. We also contribute a new publicly available image quality dataset which provides subjective scores on images that evenly cover a wide range of scene and object categories. As most public image quality datasets so far span limited semantic categories, this new dataset opens new possibilities to further explore image semantics and quality of experience.
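The core idea of the abstract, concatenating semantic category features with perceptual quality features before regressing onto subjective scores, can be sketched as follows. This is a minimal illustration with synthetic data, an ordinary least-squares regressor, and arbitrary feature dimensions; the paper uses its own NR-IQM features, semantic classifiers, and learning setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # number of images (synthetic)

# Stand-ins for real features: perceptual quality features (e.g. NSS-based),
# plus one-hot scene and object category indicators (dimensions are assumptions).
quality = rng.normal(size=(n, 10))
scene = np.eye(5)[rng.integers(0, 5, n)]
objects = np.eye(4)[rng.integers(0, 4, n)]

# Synthetic mean opinion scores that depend on both quality and semantics.
mos = 2.0 * quality[:, 0] + 0.5 * (scene @ np.arange(5)) \
      + rng.normal(scale=0.3, size=n)

def fit_predict(X, y):
    """Least-squares linear regressor with a bias term; in-sample predictions."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return Xb @ w

# Baseline: perceptual quality features only.
pred_q = fit_predict(quality, mos)
# Semantic-aware: quality + scene + object category features concatenated.
pred_sem = fit_predict(np.hstack([quality, scene, objects]), mos)

corr_q = np.corrcoef(pred_q, mos)[0, 1]
corr_sem = np.corrcoef(pred_sem, mos)[0, 1]
print(corr_q, corr_sem)
```

On data where subjective scores genuinely vary with semantic category, the augmented feature vector fits the scores at least as well as the quality-only baseline, which mirrors the correlation gains the abstract reports.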
Original language: English
Pages (from-to): 237-252
Number of pages: 16
Journal: Signal Processing: Image Communication
Volume: 60
DOIs
Publication status: Published - 2018

Keywords

  • Blind image quality assessment
  • No-reference image quality metrics (NR-IQM)
  • Quality of Experience (QoE)
  • Image semantics
  • Subjective quality datasets
