
Perceptual image quality assessment: a survey

  • Review
  • Published in: Science China Information Sciences

Abstract

Perceptual quality assessment plays a vital role in visual communication systems because quality degradations are introduced at various stages of visual signal acquisition, compression, transmission, and display. Quality assessment of visual signals can be performed subjectively or objectively, and objective quality assessment is usually preferred owing to its high efficiency and ease of deployment. A large number of subjective and objective visual quality assessment studies have been conducted in recent years. In this survey, we give an up-to-date and comprehensive review of these studies. Specifically, the frequently used subjective image quality assessment databases are reviewed first, as they serve as validation sets for objective measures. Second, objective image quality assessment measures are classified and reviewed according to their applications and the methodologies they employ. Third, the evaluation protocols are introduced, and the performance of state-of-the-art quality measures is compared. This survey provides a general overview of classical algorithms and recent progress in the field of perceptual image quality assessment.
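
The evaluation protocols referred to above typically correlate the scores produced by an objective measure with the subjective mean opinion scores (MOS) of a validation database. Below is a minimal sketch of such a protocol, assuming NumPy/SciPy, arrays of predicted scores and MOS, and a five-parameter logistic mapping applied before PLCC/RMSE; the function names and toy data are illustrative assumptions, not the survey's own evaluation code.

    # Minimal sketch of a common IQA evaluation protocol (illustrative only):
    # correlate an objective measure's scores with subjective MOS using SROCC,
    # and PLCC/RMSE after a monotonic five-parameter logistic mapping.
    import numpy as np
    from scipy import stats, optimize

    def logistic_5(x, b1, b2, b3, b4, b5):
        # Monotonic nonlinear mapping often fitted before computing PLCC/RMSE.
        return b1 * (0.5 - 1.0 / (1.0 + np.exp(b2 * (x - b3)))) + b4 * x + b5

    def evaluate_measure(pred, mos):
        pred = np.asarray(pred, dtype=float)
        mos = np.asarray(mos, dtype=float)

        # Rank-order correlation needs no mapping.
        srocc, _ = stats.spearmanr(pred, mos)

        # Fit the logistic mapping from predicted scores to MOS.
        p0 = [np.max(mos), 1.0, np.mean(pred), 0.0, np.mean(mos)]
        try:
            params, _ = optimize.curve_fit(logistic_5, pred, mos, p0=p0, maxfev=10000)
            mapped = logistic_5(pred, *params)
        except RuntimeError:
            mapped = pred  # fall back to raw scores if the fit does not converge

        plcc, _ = stats.pearsonr(mapped, mos)
        rmse = float(np.sqrt(np.mean((mapped - mos) ** 2)))
        return {"SROCC": float(srocc), "PLCC": float(plcc), "RMSE": rmse}

    if __name__ == "__main__":
        # Hypothetical toy data: MOS from a subjective study, scores from a measure.
        mos = [2.1, 3.4, 4.0, 1.5, 3.0, 4.5, 2.8, 3.9]
        pred = [0.30, 0.55, 0.70, 0.20, 0.50, 0.80, 0.45, 0.65]
        print(evaluate_measure(pred, mos))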


References

  1. Wang Z, Bovik A C. Mean squared error: love it or leave it? a new look at signal fidelity measures. IEEE Signal Process Mag, 2009, 26: 98–117

    Google Scholar 

  2. Wang Z, Bovik A C. Reduced- and no-reference image quality assessment. IEEE Signal Process Mag, 2011, 28: 29–40

    Google Scholar 

  3. Lin W, Kuo C C J. Perceptual visual quality metrics: a survey. J Visual Commun Image Represent, 2011, 22: 297–312

    Google Scholar 

  4. Moorthy A K, Bovik A C. Visual quality assessment algorithms: what does the future hold? Mult Tools Appl, 2011, 51: 675–696

    Google Scholar 

  5. Chandler D M. Seven challenges in image quality assessment: past, present, and future research. ISRN Signal Process, 2013, 2013

    Google Scholar 

  6. He L, Gao F, Hou W, et al. Objective image quality assessment: a survey. Int J Comput Math, 2014, 91: 2374–2388

    MathSciNet  MATH  Google Scholar 

  7. Mohammadi P, Ebrahimi-Moghadam A, Shirani S. Subjective and objective quality assessment of image: a survey. 2014. ArXiv: 14067799

    Google Scholar 

  8. Manap R A, Shao L. Non-distortion-specific no-reference image quality assessment: a survey. Inf Sci, 2015, 301: 141–160

    Google Scholar 

  9. Xu S, Jiang S, Min W. No-reference/blind image quality assessment: a survey. IETE Tech Rev, 2017, 34: 223–245

    Google Scholar 

  10. Winkler S. Analysis of public image and video databases for quality assessment. IEEE J Sel Top Signal Process, 2012, 6: 616–625

    Google Scholar 

  11. Winkler S, Subramanian R. Overview of eye tracking datasets. In: Proceedings of IEEE International Workshop on Quality of Multimedia Experience, 2013. 212–217

    Google Scholar 

  12. Series B. Methodology for the Subjective Assessment of the Quality of Television Pictures. Recommendation ITU-R BT, 2012. 500–13

    Google Scholar 

  13. Sheikh H R, Wang Z, Cormack L, et al. LIVE image quality assessment database release 2. http://live.ece.utexas.edu/research/quality

  14. Ponomarenko N, Lukin V, Zelensky A, et al. TID2008-a database for evaluation of full-reference visual quality assessment metrics. Adv Modern Radioelectron, 2009, 10: 30–45

    Google Scholar 

  15. Ponomarenko N, Jin L, Ieremeiev O, et al. Image database TID2013: peculiarities, results and perspectives. Signal Process Image Commun, 2015, 30: 57–77

    Google Scholar 

  16. Larson E, Chandler D. Consumer subjective image quality database. 2009. http://visionokstate.edu/csiq/

    Google Scholar 

  17. Le Callet P, Autrusseau F. Subjective quality assessment IRCCyN/IVC database. http://ivc.univ-nantes.fr/en/databases/SubjectiveDatabase/

  18. Horita Y, Shibata K, Kawayoke Y, et al. Mict image quality evaluation database. http://mict.eng.u-toyama.ac.jp/mictdb.html

  19. Chandler D M, Hemami S S. VSNR: a wavelet-based visual signal-to-noise ratio for natural images. IEEE Trans Image Process, 2007, 16: 2284–2298

    MathSciNet  Google Scholar 

  20. Ma K, Duanmu Z, Wu Q, et al. Waterloo exploration database: new challenges for image quality assessment models. IEEE Trans Image Process, 2017, 26: 1004–1016

    MathSciNet  MATH  Google Scholar 

  21. Benoit A, Le Callet P, Campisi P, et al. Quality assessment of stereoscopic images. EURASIP J Image Video Process, 2009, 2008: 659024

    Google Scholar 

  22. Chen M J, Su C C, Kwon D K, et al. Full-reference quality assessment of stereopairs accounting for rivalry. Signal Process Image Commun, 2013, 28: 1143–1155

    Google Scholar 

  23. Chen M J, Cormack L K, Bovik A C. No-reference quality assessment of natural stereopairs. IEEE Trans Image Process, 2013, 22: 3379–3391

    MathSciNet  MATH  Google Scholar 

  24. Wang J, Rehman A, Zeng K, et al. Quality prediction of asymmetrically distorted stereoscopic 3D images. IEEE Trans Image Process, 2015, 24: 3400–3414

    MathSciNet  MATH  Google Scholar 

  25. Wang X, Yu M, Yang Y, et al. Research on subjective stereoscopic image quality assessment. In: Proceedings of the SPIE, 2009

    Google Scholar 

  26. Zhou J, Jiang G, Mao X, et al. Subjective quality analyses of stereoscopic images in 3DTV system. In: Proceedings of IEEE International Conference on Visual Communications and Image Processing, 2011. 1–4

    Google Scholar 

  27. Yang J, Hou C, Zhou Y, et al. Objective quality assessment method of stereo images. In: Proceedings of 3DTV Conference: the True Vision-Capture, Transmission and Display of 3D Video, 2009. 1–4

    Google Scholar 

  28. Song R, Ko H, Kuo C C J. MCL-3D: a database for stereoscopic image quality assessment using 2d-image-plus-depth source. J Inf Sci Eng, 2015, 31: 1593–1611

    Google Scholar 

  29. Jung Y J, Sohn H, Lee S I, et al. Predicting visual discomfort of stereoscopic images using human attention model. IEEE Trans Circ Syst Video Tech, 2013, 23: 2077–2082

    Google Scholar 

  30. Goldmann L, de Simone F, Ebrahimi T. Impact of acquisition distortion on the quality of stereoscopic images. In: Proceedings of International Workshop on Video Processing and Quality Metrics for Consumer Electronics, 2010

    Google Scholar 

  31. Rubinstein M, Gutierrez D, Sorkine O, et al. Retarget me a benchmark for image retargeting. ACM Trans Graph, 2010, 29: 1–9

    Google Scholar 

  32. Ma L, Lin W, Deng C, et al. Image retargeting quality assessment: a study of subjective scores and objective metrics. IEEE J Sel Top Signal Process, 2012, 6: 626–639

    Google Scholar 

  33. Jayaraman D, Mittal A, Moorthy A K, et al. Objective quality assessment of multiply distorted images. In: Proceedings of the 46th Asilomar Conference on Signals, Systems and Computers (ASILOMAR), 2012. 1693–1697

    Google Scholar 

  34. Gu K, Zhai G, Yang X, et al. Hybrid no-reference quality metric for singly and multiply distorted images. IEEE Trans Broadcast, 2014, 60: 555–567

    Google Scholar 

  35. Sun W, Zhou F, Liao Q. MDID: a multiply distorted image database for image quality assessment. Pattern Recogn, 2017, 61: 153–168

    Google Scholar 

  36. Yang H, Fang Y, Lin W, et al. Subjective quality assessment of screen content images. In: Proceedings of IEEE International Workshop on Quality of Multimedia Experience, 2014. 257–262

    Google Scholar 

  37. Ni Z, Ma L, Zeng H, et al. ESIM: edge similarity for screen content image quality assessment. IEEE Trans Image Process, 2017, 26: 4818–4831

    MathSciNet  Google Scholar 

  38. Min X, Ma K, Gu K, et al. Unified blind quality assessment of compressed natural, graphic, and screen content images. IEEE Trans Image Process, 2017, 26: 5462–5474

    MathSciNet  MATH  Google Scholar 

  39. Gu K, Xu X, Qiao J, et al. Learning a unified blind image quality metric via on-line and off-line big training instances. IEEE Trans Big Data, 2019. doi: 10.1109/TBDATA.2019.2895605

    Google Scholar 

  40. Ghadiyaram D, Bovik A C. Massive online crowdsourced study of subjective and objective picture quality. IEEE Trans Image Process, 2016, 25: 372–387

    MathSciNet  MATH  Google Scholar 

  41. Virtanen T, Nuutinen M, Vaahteranoksa M, et al. CID2013: a database for evaluating no-reference image quality assessment algorithms. IEEE Trans Image Process, 2015, 24: 390–402

    MathSciNet  MATH  Google Scholar 

  42. Yeganeh H, Wang Z. Objective quality assessment of tone-mapped images. IEEE Trans Image Process, 2013, 22: 657–667

    MathSciNet  MATH  Google Scholar 

  43. Kundu D, Ghadiyaram D, Bovik A C, et al. Large-scale crowdsourced study for tone-mapped HDR pictures. IEEE Trans Image Process, 2017, 26: 4725–4740

    MathSciNet  Google Scholar 

  44. Ma K, Zeng K, Wang Z. Perceptual quality assessment for multi-exposure image fusion. IEEE Trans Image Process, 2015, 24: 3345–3356

    MathSciNet  MATH  Google Scholar 

  45. Bosc E, Pepion R, Le Callet P, et al. Towards a new quality metric for 3-D synthesized view assessment. IEEE J Sel Top Signal Process, 2011, 5: 1332–1343

    Google Scholar 

  46. Min X, Zhai G, Gu K, et al. Objective quality evaluation of dehazed images. IEEE Trans Intell Trans Syst, 2019, 20: 2879–2892

    Google Scholar 

  47. Min X, Zhai G, Gu K, et al. Quality evaluation of image dehazing methods using synthetic hazy images. IEEE Trans Mult, 2019, 21: 2319–2333

    Google Scholar 

  48. Ma K, Liu W, Wang Z. Perceptual evaluation of single image dehazing algorithms. In: Proceedings of IEEE International Conference on Image Processing, 2015. 3600–3604

    Google Scholar 

  49. Duan H, Zhai G, Min X, et al. Perceptual quality assessment of omnidirectional images. In: Proceedings of IEEE International Symposium on Circuits and Systems, 2018. 1–5

    Google Scholar 

  50. Sun W, Min X, Zhai G, et al. MC360IQA: a multi-channel CNN for blind 360-degree image quality assessment. IEEE J Sel Top Signal Process, 2020, 14: 64–77

    Google Scholar 

  51. Chen M, Jin Y, Goodall T, et al. Study of 3D virtual reality picture quality. 2019. ArXiv: 191003074

    Google Scholar 

  52. Shao F, Lin W, Gu S, et al. Perceptual full-reference quality assessment of stereoscopic images by considering binocular visual characteristics. IEEE Trans Image Process, 2013, 22: 1940–1953

    MathSciNet  MATH  Google Scholar 

  53. Sun W, Luo W, Min X, et al. Mc360iqa: the multi-channel cnn for blind 360-degree image quality assessment. In: Proceedings of IEEE International Symposium on Circuits and Systems, 2019. 1–5

    Google Scholar 

  54. Liu H, Heynderickx I. Studying the added value of visual attention in objective image quality metrics based on eye movement data. In: Proceedings of IEEE International Conference on Image Processing, 2009. 3097–3100

    Google Scholar 

  55. Alers H, Redi J, Liu H, et al. Studying the effect of optimizing image quality in salient regions at the expense of background content. J Electron Imag, 2013, 22: 043012

    Google Scholar 

  56. Redi J A, Liu H, Zunino R, et al. Interactions of visual attention and quality perception. In: Proceedings of SPIE, 2011. 7865: 78650S

    Google Scholar 

  57. Engelke U, Maeder A, Zepernick H J. Visual attention modelling for subjective image quality databases. In: Proceedings of IEEE International Workshop on Multimedia Signal Processing, 2009. 1–6

    Google Scholar 

  58. Min X, Zhai G, Gao Z, et al. Visual attention data for image quality assessment databases. In: Proceedings of IEEE International Symposium on Circuits and Systems, 2014. 894–897

    Google Scholar 

  59. Wang Z, Bovik A C, Sheikh H R, et al. Image quality assessment: from error visibility to structural similarity. IEEE Trans Image Process, 2004, 13: 600–612

    Google Scholar 

  60. Wang Z, Simoncelli E P, Bovik A C. Multiscale structural similarity for image quality assessment. In: Proceedings of IEEE Asilomar Conference on Signals, Systems, and Computers, 2003. 1398–1402

    Google Scholar 

  61. Wang Z, Li Q. Information content weighting for perceptual image quality assessment. IEEE Trans Image Process, 2011, 20: 1185–1198

    MathSciNet  MATH  Google Scholar 

  62. Tan H L, Li Z G, Tan Y H, et al. A perceptually relevant MSE-based image quality metric. IEEE Trans Image Process, 2013, 22: 4447–4459

    MathSciNet  MATH  Google Scholar 

  63. Wu J J, Lin W S, Shi G M, et al. Perceptual quality metric with internal generative mechanism. IEEE Trans Image Process, 2013, 22: 43–54

    MathSciNet  MATH  Google Scholar 

  64. Zhang L, Zhang L, Mou X Q, et al. FSIM: a feature similarity index for image quality assessment. IEEE Trans Image Process, 2011, 20: 2378–2386

    MathSciNet  MATH  Google Scholar 

  65. Gu K, Li L, Lu H, et al. A fast reliable image quality predictor by fusing micro- and macro-structures. IEEE Trans Ind Electron, 2017, 64: 3903–3912

    Google Scholar 

  66. Liu A M, Lin W S, Narwaria M. Image quality assessment based on gradient similarity. IEEE Trans Image Process, 2012, 21: 1500–1512

    MathSciNet  MATH  Google Scholar 

  67. Xue W, Zhang L, Mou X, et al. Gradient magnitude similarity deviation: a highly efficient perceptual image quality index. IEEE Trans Image Process, 2014, 23: 684–695

    MathSciNet  MATH  Google Scholar 

  68. Zhu J Y, Wang N C. Image quality assessment by visual gradient similarity. IEEE Trans Image Process, 2012, 21: 919–933

    MathSciNet  MATH  Google Scholar 

  69. Zhan Y, Zhang R, Wu Q. A structural variation classification model for image quality assessment. IEEE Trans Mult, 2017, 19: 1837–1847

    Google Scholar 

  70. Zhang M, Mou X, Zhang L. Non-shift edge based ratio (NSER): an image quality assessment metric based on early vision features. IEEE Signal Process Lett, 2011, 18: 315–318

    Google Scholar 

  71. Capodiferro L, Jacovitti G, Di Claudio E D. Two-dimensional approach to full-reference image quality assessment based on positional structural information. IEEE Trans Image Process, 2012, 21: 505–516

    MathSciNet  MATH  Google Scholar 

  72. Di Claudio E D, Jacovitti G. A detail-based method for linear full reference image quality prediction. IEEE Trans Image Process, 2018, 27: 179–193

    MathSciNet  MATH  Google Scholar 

  73. Ding L, Huang H, Zang Y. Image quality assessment using directional anisotropy structure measurement. IEEE Trans Image Process, 2017, 26: 1799–1809

    MathSciNet  MATH  Google Scholar 

  74. Sun W, Liao Q, Xue J H, et al. SPSIM: a superpixel-based similarity index for full-reference image quality assessment. IEEE Trans Image Process, 2018, 27: 4232–4244

    MathSciNet  MATH  Google Scholar 

  75. Narwaria M, Lin W S. Objective image quality assessment based on support vector regression. IEEE Trans Neural Netw, 2010, 21: 515–519

    Google Scholar 

  76. Shnayderman A, Gusev A, Eskicioglu A M. An SVD-based grayscale image quality measure for local and global assessment. IEEE Trans Image Process, 2006, 15: 422–429

    Google Scholar 

  77. Liu T J, Liu K H, Lin J Y, et al. A paraboost method to image quality assessment. IEEE Trans Neural Netw Learn Syst, 2017, 28: 107–121

    Google Scholar 

  78. He L, Wang D, Liu Q, et al. Fast image quality assessment via supervised iterative quantization method. Neurocomputing, 2016, 212: 121–127

    Google Scholar 

  79. Peng P, Li Z N. General-purpose image quality assessment based on distortion-aware decision fusion. Neurocomputing, 2014, 134: 117–121

    Google Scholar 

  80. Chang H W, Yang H, Gan Y, et al. Sparse feature fidelity for perceptual image quality assessment. IEEE Trans Image Process, 2013, 22: 4007–4018

    MathSciNet  MATH  Google Scholar 

  81. Li L, Cai H, Zhang Y, et al. Sparse representation-based image quality index with adaptive sub-dictionaries. IEEE Trans Image Process, 2016, 25: 3775–3786

    MathSciNet  MATH  Google Scholar 

  82. Yuan Y, Guo Q, Lu X. Image quality assessment: a sparse learning way. Neurocomputing, 2015, 159: 227–241

    Google Scholar 

  83. Ahar A, Barri A, Schelkens P. From sparse coding significance to perceptual quality: a new approach for image quality assessment. IEEE Trans Image Process, 2018, 27: 879–893

    MathSciNet  MATH  Google Scholar 

  84. Pang Y, Sun M, Jiang X, et al. Convolution in convolution for network in network. IEEE Trans Neural Netw Learn Syst, 2017, 29: 1587–1597

    MathSciNet  Google Scholar 

  85. Pang Y, Cao J, Wang J, et al. JCS-Net: joint classification and super-resolution network for small-scale pedestrian detection in surveillance images. IEEE Trans Inform Forensic Secur, 2019, 14: 3322–3331

    Google Scholar 

  86. Zhao C R, Chen K, Zang D, et al. Uncertainty-optimized deep learning model for small-scale person re-identification. Sci China Inf Sci, 2019, 62: 220102

    Google Scholar 

  87. Chen J, Lian Z H, Wang Y Z, et al. Irregular scene text detection via attention guided border labeling. Sci China Inf Sci, 2019, 62: 220103

    Google Scholar 

  88. Liu B, Chen X, Han Y, et al. Accelerating DNN-based 3D point cloud processing for mobile computing. Sci China Inf Sci, 2019, 62: 212206

    MathSciNet  Google Scholar 

  89. Zhu J, Zeng H, Jin X, et al. Joint horizontal and vertical deep learning feature for vehicle re-identification. Sci China Inf Sci, 2019, 62: 199101

    Google Scholar 

  90. Gao F, Wang Y, Li P, et al. DeepSim: deep similarity for image quality assessment. Neurocomputing, 2017, 257: 104–114

    Google Scholar 

  91. Wang H, Fu J, Lin W, et al. Image quality assessment based on local linear information and distortion-specific compensation. IEEE Trans Image Process, 2017, 26: 915–926

    MathSciNet  MATH  Google Scholar 

  92. Kim J, Lee S. Deep learning of human visual sensitivity in image quality assessment framework. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2017. 1969–1977

    Google Scholar 

  93. Bosse S, Maniry D, Muller K R, et al. Deep neural networks for no-reference and full-reference image quality assessment. IEEE Trans Image Process, 2018, 27: 206–219

    MathSciNet  MATH  Google Scholar 

  94. Sheikh H R, Bovik A C, Veciana G D. An information fidelity criterion for image quality assessment using natural scene statistics. IEEE Trans Image Process, 2005, 14: 2117–2128

    Google Scholar 

  95. Sheikh H R, Bovik A C. Image information and visual quality. IEEE Trans Image Process, 2006, 15: 430–444

    Google Scholar 

  96. Demirtas A M, Reibman A R, Jafarkhani H. Full-reference quality estimation for images with different spatial resolutions. IEEE Trans Image Process, 2014, 23: 2069–2080

    MathSciNet  MATH  Google Scholar 

  97. Li S, Zhang F, Ma L, et al. Image quality assessment by separately evaluating detail losses and additive impairments. IEEE Trans Mult, 2011, 13: 935–949

    Google Scholar 

  98. Tang C, Yang X, Zhai G. Image quality/distortion metric based on stable model similarity in wavelet domain. J Visual Commun Image Represent, 2014, 25: 1746–1757

    Google Scholar 

  99. Bae S H, Kim M. DCT-QM: a DCT-based quality degradation metric for image quality optimization problems. IEEE Trans Image Process, 2016, 25: 4916–4930

    MathSciNet  MATH  Google Scholar 

  100. Bae S H, Kim M. A novel image quality assessment with globally and locally consilient visual quality perception. IEEE Trans Image Process, 2016, 25: 2392–2406

    MathSciNet  MATH  Google Scholar 

  101. Redi J A, Gastaldo P, Heynderickx I, et al. Color distribution information for the reduced-reference assessment of perceived image quality. IEEE Trans Circ Syst Video Technol, 2010, 20: 1757–1769

    Google Scholar 

  102. Wu J, Lin W, Shi G, et al. Reduced-reference image quality assessment with visual information fidelity. IEEE Trans Mult, 2013, 15: 1700–1705

    Google Scholar 

  103. Decherchi S, Gastaldo P, Zunino R, et al. Circular-ELM for the reduced-reference assessment of perceived image quality. Neurocomputing, 2013, 102: 78–89

    Google Scholar 

  104. Bampis C G, Gupta P, Soundararajan R, et al. SpEED-QA: spatial efficient entropic differencing for image and video quality. IEEE Signal Process Lett, 2017, 24: 1333–1337

    Google Scholar 

  105. Zhang Y, Phan T D, Chandler D M. Reduced-reference image quality assessment based on distortion families of local perceived sharpness. Signal Process Image Commun, 2017, 55: 130–145

    Google Scholar 

  106. Min X, Gu K, Zhai G, et al. Saliency-induced reduced-reference quality index for natural scene and screen content images. Signal Process, 2018, 145: 127–136

    Google Scholar 

  107. Liu Y, Zhai G, Gu K, et al. Reduced-reference image quality assessment in free-energy principle and sparse representation. IEEE Trans Mult, 2018, 20: 379–391

    Google Scholar 

  108. Gao X, Lu W, Li X, et al. Wavelet-based contourlet in quality evaluation of digital images. Neurocomputing, 2008, 72: 378–385

    Google Scholar 

  109. Wang Z, Wu G X, Sheikh H R, et al. Quality-aware images. IEEE Trans Image Process, 2006, 15: 1680–1689

    Google Scholar 

  110. Soundararajan R, Bovik A C. RRED indices: reduced reference entropic differencing for image quality assessment. IEEE Trans Image Process, 2012, 21: 517–526

    MathSciNet  MATH  Google Scholar 

  111. Rehman A, Zhou Wang A. Reduced-reference image quality assessment by structural similarity estimation. IEEE Trans Image Process, 2012, 21: 3378–3389

    MathSciNet  MATH  Google Scholar 

  112. Ma L, Li S, Zhang F, et al. Reduced-reference image quality assessment using reorganized DCT-based image representation. IEEE Trans Mult, 2011, 13: 824–829

    Google Scholar 

  113. Golestaneh S A, Karam L J. Reduced-reference quality assessment based on the entropy of DWT coefficients of locally weighted gradient magnitudes. IEEE Trans Image Process, 2016, 25: 5293–5303

    MathSciNet  MATH  Google Scholar 

  114. Li Q, Wang Z. Reduced-reference image quality assessment using divisive normalization-based image representation. IEEE J Sel Top Signal Process, 2009, 3: 202–211

    Google Scholar 

  115. Gao X B, Lu W, Tao D C, et al. Image quality assessment based on multiscale geometric analysis. IEEE Trans Image Process, 2009, 18: 1409–1423

    MathSciNet  MATH  Google Scholar 

  116. Zhu W, Zhai G, Min X, et al. Multi-channel decomposition in tandem with free-energy principle for reduced-reference image quality assessment. IEEE Trans Mult, 2019, 21: 2334–2346

    Google Scholar 

  117. Wang Z, Bovik A C, Evan B. Blind measurement of blocking artifacts in images. In: Proceedings of IEEE International Conference on Image Processing, 2000. 3: 981–984

    Google Scholar 

  118. Lee S, Park S J. A new image quality assessment method to detect and measure strength of blocking artifacts. Signal Process Image Commun, 2012, 27: 31–38

    Google Scholar 

  119. Liu H, Heynderickx I. A no-reference perceptual blockiness metric. In: Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing, 2008. 865–868

    Google Scholar 

  120. Liu H, Heynderickx I. A perceptually relevant no-reference blockiness metric based on local image characteristics. EURASIP J Adv Signal Process, 2009, 2009: 263540

    MATH  Google Scholar 

  121. Pan F, Lin X, Rahardja S, et al. Using edge direction information for measuring blocking artifacts of images. Multidim Syst Sign Process, 2007, 18: 297–308

    MathSciNet  MATH  Google Scholar 

  122. Li L, Lin W, Zhu H. Learning structural regularity for evaluating blocking artifacts in JPEG images. IEEE Signal Process Lett, 2014, 21: 918–922

    Google Scholar 

  123. Li L, Zhou Y, Wu J, et al. GridSAR: grid strength and regularity for robust evaluation of blocking artifacts in JPEG images. J Visual Commun Image Represent, 2015, 30: 153–163

    Google Scholar 

  124. Min X, Zhai G, Gu K, et al. Blind quality assessment of compressed images via pseudo structural similarity. In: Proceedings of IEEE International Conference on Multimedia and Expo, 2016. 1–6

    Google Scholar 

  125. Wang Z, Sheikh H R, Bovik A C. No-reference perceptual quality assessment of JPEG compressed images. In: Proceedings of IEEE International Conference on Image Processing, 2002. 477–480

    Google Scholar 

  126. Perra C, Massidda F, Giusto D D. Image blockiness evaluation based on sobel operator. In: Proceedings of IEEE International Conference on Image Processing, 2005. 389

    Google Scholar 

  127. Zhan Y, Zhang R. No-reference JPEG image quality assessment based on blockiness and luminance change. IEEE Signal Process Lett, 2017, 24: 760–764

    Google Scholar 

  128. Gastaldo P, Parodi G, Redi J, et al. No-reference quality assessment of JPEG images by using CBP neural networks. In: Proceedings of International Conference on Artificial Neural Networks, 2007. 564–572

    Google Scholar 

  129. Ridella S, Rovetta S, Zunino R. Circular backpropagation networks for classification. IEEE Trans Neural Netw, 1997, 8: 84–97

    Google Scholar 

  130. Bovik A C, Liu S. DCT-domain blind measurement of blocking artifacts in DCT-coded images. In: Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing, 2001. 3: 1725–1728

    Google Scholar 

  131. Chen C, Bloom J A. A blind reference-free blockiness measure. In: Proceedings of Pacific-Rim Conference on Multimedia, 2010. 112–123

    Google Scholar 

  132. Golestaneh S A, Chandler D M. No-reference quality assessment of JPEG images via a quality relevance map. IEEE Signal Process Lett, 2014, 21: 155–158

    Google Scholar 

  133. Li L, Zhu H, Yang G, et al. Referenceless measure of blocking artifacts by tchebichef kernel analysis. IEEE Signal Process Lett, 2014, 21: 122–125

    Google Scholar 

  134. Ci W, Dong H, Wu Z, et al. Example-based objective quality estimation for compressed images. IEEE MultiMedia, 2019. doi: 10.1109/MMUL.2009.77

    Google Scholar 

  135. Wang C, Shen M, Yao C. No-reference quality assessment for DCT-based compressed image. J Visual Commun Image Represent, 2015, 28: 53–59

    Google Scholar 

  136. Liu C, Freeman W T, Szeliski R, et al. Noise estimation from a single image. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2006. 901–908

    Google Scholar 

  137. Zoran D, Weiss Y. Scale invariance and noise in natural images. In: Proceedings of IEEE International Conference on Computer Vision, 2009. 2209–2216

    Google Scholar 

  138. Zhai G, Wu X. Noise estimation using statistics of natural images. In: Proceedings of IEEE International Conference on Image Processing, 2011. 1857–1860

    Google Scholar 

  139. Tang C, Yang X, Zhai G. Noise estimation of natural images via statistical analysis and noise injection. IEEE Trans Circ Syst Video Technol, 2015, 25: 1283–1294

    Google Scholar 

  140. Dong L, Zhou J, Tang Y Y. Effective and fast estimation for image sensor noise via constrained weighted least squares. IEEE Trans Image Process, 2018, 27: 2715–2730

    MathSciNet  MATH  Google Scholar 

  141. Zhai G, Wu X. On monotonicity of image quality metrics. In: Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing, 2012. 1157–1160

    Google Scholar 

  142. Zhai G, Kaup A, Wang J, et al. A dual-model approach to blind quality assessment of noisy images. APSIPA Trans Signal Inf Process, 2015, 4: 29–32

    Google Scholar 

  143. Marziliano P, Dufaux F, Winkler S, et al. A no-reference perceptual blur metric. In: Proceedings of IEEE International Conference on Image Processing, 2002

    Google Scholar 

  144. Ong E, Lin W, Lu Z, et al. A no-reference quality metric for measuring image blur. In: Proceedings of International Symposium on Signal Processing and its Applications, 2003. 469–472

    Google Scholar 

  145. Ferzli R, Karam L J. A no-reference objective image sharpness metric based on the notion of just noticeable blur (JNB). IEEE Trans Image Process, 2009, 18: 717–728

    MathSciNet  MATH  Google Scholar 

  146. Narvekar N D, Karam L J. A no-reference image blur metric based on the cumulative probability of blur detection (CPBD). IEEE Trans Image Process, 2011, 20: 2678–2683

    MathSciNet  MATH  Google Scholar 

  147. Feichtenhofer C, Fassold H, Schallauer P. A perceptual image sharpness metric based on local edge gradient analysis. IEEE Signal Process Lett, 2013, 20: 379–382

    Google Scholar 

  148. Bahrami K, Kot A C. A fast approach for no-reference image sharpness assessment based on maximum local variation. IEEE Signal Process Lett, 2014, 21: 751–755

    Google Scholar 

  149. Gu K, Zhai G T, Lin W S, et al. No-reference image sharpness assessment in autoregressive parameter space. IEEE Trans Image Process, 2015, 24: 3218–3231

    MathSciNet  MATH  Google Scholar 

  150. Li L, Wu D, Wu J, et al. Image sharpness assessment by sparse representation. IEEE Trans Mult, 2016, 18: 1085–1097

    MathSciNet  Google Scholar 

  151. Marichal X, Ma W Y, Zhang H. Blur determination in the compressed domain using DCT information. In: Proceedings of IEEE International Conference on Image Processing, 1999. 386–390

    Google Scholar 

  152. Shaked D, Tastl I. Sharpness measure: towards automatic image enhancement. In: Proceedings of IEEE International Conference on Image Processing, 2005. 937

    Google Scholar 

  153. Vu P V, Chandler D M. A fast wavelet-based algorithm for global and local image sharpness estimation. IEEE Signal Process Lett, 2012, 19: 423–426

    Google Scholar 

  154. Hassen R, Wang Zhou, Salama M M A. Image sharpness assessment based on local phase coherence. IEEE Trans Image Process, 2013, 22: 2798–2810

    Google Scholar 

  155. Oh T, Park J, Seshadrinathan K, et al. No-reference sharpness assessment of camera-shaken images by analysis of spectral structure. IEEE Trans Image Process, 2014, 23: 5428–5439

    MathSciNet  MATH  Google Scholar 

  156. Caviedes J, Oberti F. A new sharpness metric based on local kurtosis, edge and energy information. Signal Process Image Commun, 2004, 19: 147–161

    Google Scholar 

  157. Ciancio A, da Costa A L N T, da Silva E A B, et al. No-reference blur assessment of digital pictures based on multifeature classifiers. IEEE Trans Image Process, 2011, 20: 64–75

    MathSciNet  MATH  Google Scholar 

  158. Vu C T, Phan T D, Chandler D M. S3: a spectral and spatial measure of local perceived sharpness in natural images. IEEE Trans Image Process, 2012, 21: 934–945

    MathSciNet  MATH  Google Scholar 

  159. Li L, Xia W, Lin W, et al. No-reference and robust image sharpness evaluation based on multiscale spatial and spectral features. IEEE Trans Mult, 2017, 19: 1030–1040

    Google Scholar 

  160. Marziliano P, Dufaux F, Winkler S, et al. Perceptual blur and ringing metrics: application to JPEG2000. Signal Process Image Commun, 2004, 19: 163–172

    Google Scholar 

  161. Sheikh H R, Bovik A C, Cormack L. No-reference quality assessment using natural scene statistics: JPEG2000. IEEE Trans Image Process, 2005, 14: 1918–1927

    Google Scholar 

  162. Sazzad Z P, Kawayoke Y, Horita Y. Spatial features based no reference image quality assessment for JPEG2000. In: Proceedings of IEEE International Conference on Image Processing, 2007. 517

    Google Scholar 

  163. Sazzad Z M P, Kawayoke Y, Horita Y. No reference image quality assessment for JPEG2000 based on spatial features. Signal Process Image Commun, 2008, 23: 257–268

    Google Scholar 

  164. Zhang J, Le T. A new no-reference quality metric for JPEG2000 images. IEEE Trans Consumer Electron, 2010, 56: 743–750

    Google Scholar 

  165. Liu H, Klomp N, Heynderickx I. A no-reference metric for perceived ringing artifacts in images. IEEE Trans Circ Syst Video Technol, 2010, 20: 529–539

    Google Scholar 

  166. Liang L, Wang S, Chen J, et al. No-reference perceptual image quality metric using gradient profiles for JPEG2000. Signal Process Image Commun, 2010, 25: 502–516

    Google Scholar 

  167. Zhang J, Ong S H, Le T M. Kurtosis-based no-reference quality assessment of JPEG2000 images. Signal Process Image Commun, 2011, 26: 13–23

    Google Scholar 

  168. Moorthy A K, Bovik A C. A two-step framework for constructing blind image quality indices. IEEE Signal Process Lett, 2010, 17: 513–516

    Google Scholar 

  169. Moorthy A K, Bovik A C. Blind image quality assessment: from natural scene statistics to perceptual quality. IEEE Trans Image Process, 2011, 20: 3350–3364

    MathSciNet  MATH  Google Scholar 

  170. Tang H, Joshi N, Kapoor A. Learning a blind measure of perceptual image quality. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2011. 305–312

    Google Scholar 

  171. Gao X B, Gao F, Tao D C, et al. Universal blind image quality assessment metrics via natural scene statistics and multiple kernel learning. IEEE Trans Neural Netw Learn Syst, 2013, 24: 2013–2026

    Google Scholar 

  172. Zhang Y, Moorthy A K, Chandler D M, et al. C-DIIVINE: no-reference image quality assessment based on local magnitude and phase statistics of natural scenes. Signal Process Image Commun, 2014, 29: 725–747

    Google Scholar 

  173. Wang Q, Chu J, Xu L, et al. A new blind image quality framework based on natural color statistic. Neurocomputing, 2016, 173: 1798–1810

    Google Scholar 

  174. Saad M A, Bovik A C, Charrier C. A DCT statistics-based blind image quality index. IEEE Signal Process Lett, 2010, 17: 583–586

    Google Scholar 

  175. Saad M A, Bovik A C, Charrier C. Blind image quality assessment: a natural scene statistics approach in the DCT domain. IEEE Trans Image Process, 2012, 21: 3339–3352

    MathSciNet  MATH  Google Scholar 

  176. Mittal A, Moorthy A K, Bovik A C. No-reference image quality assessment in the spatial domain. IEEE Trans Image Process, 2012, 21: 4695–4708

    MathSciNet  MATH  Google Scholar 

  177. Mittal A, Soundararajan R, Bovik A C. Making a “completely blind” image quality analyzer. IEEE Signal Process Lett, 2013, 20: 209–212

    Google Scholar 

  178. Zhang L, Zhang L, Bovik A C. A feature-enriched completely blind image quality evaluator. IEEE Trans Image Process, 2015, 24: 2579–2591

    MathSciNet  MATH  Google Scholar 

  179. Xue W, Mou X, Zhang L, et al. Blind image quality assessment using joint statistics of gradient magnitude and laplacian features. IEEE Trans Image Process, 2014, 23: 4850–4862

    MathSciNet  MATH  Google Scholar 

  180. Lee D, Plataniotis K N. Toward a no-reference image quality assessment using statistics of perceptual color descriptors. IEEE Trans Image Process, 2016, 25: 3875–3889

    MathSciNet  MATH  Google Scholar 

  181. Mittal A, Moorthy A K, Bovik A C. Making image quality assessment robust. In: Proceedings of the 46th Asilomar Conference on Signals, Systems and Computers (ASILOMAR), 2012. 1718–1722

    Google Scholar 

  182. Wu Q, Wang Z, Li H. A highly efficient method for blind image quality assessment. In: Proceedings of IEEE International Conference on Image Processing, 2015. 339–343

    Google Scholar 

  183. Zhang M, Muramatsu C, Zhou X, et al. Blind image quality assessment using the joint statistics of generalized local binary pattern. IEEE Signal Process Lett, 2015, 22: 207–210

    Google Scholar 

  184. Lu W, Zeng K, Tao D, et al. No-reference image quality assessment in contourlet domain. Neurocomputing, 2010, 73: 784–794

    Google Scholar 

  185. Shen J, Li Q, Erlebacher G. Hybrid no-reference natural image quality assessment of noisy, blurry, JPEG2000, and JPEG images. IEEE Trans Image Process, 2011, 20: 2089–2098

    MathSciNet  MATH  Google Scholar 

  186. Zhang Y, Chandler D M. No-reference image quality assessment based on log-derivative statistics of natural scenes. J Electron Imag, 2013, 22: 043025

    Google Scholar 

  187. Ye P, Doermann D. No-reference image quality assessment based on visual codebook. In: Proceedings of IEEE International Conference on Image Processing, 2011. 3089–3092

    Google Scholar 

  188. Ye P, Doermann D. No-reference image quality assessment using visual codebooks. IEEE Trans Image Process, 2012, 21: 3129–3138

    MathSciNet  MATH  Google Scholar 

  189. Ye P, Kumar J, Kang L, et al. Unsupervised feature learning framework for no-reference image quality assessment. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2012. 1098–1105

    Google Scholar 

  190. Ye P, Kumar J, Kang L, et al. Real-time no-reference image quality assessment based on filter learning. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2013. 987–994

    Google Scholar 

  191. Xue W, Zhang L, Mou X. Learning without human scores for blind image quality assessment. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2013. 995–1002

    Google Scholar 

  192. Ye P, Kumar J, Doermann D. Beyond human opinion scores: blind image quality assessment based on synthetic scores. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2014. 4241–4248

    Google Scholar 

  193. Xu J, Ye P, Li Q, et al. Blind image quality assessment based on high order statistics aggregation. IEEE Trans Image Process, 2016, 25: 4444–4457

    MathSciNet  MATH  Google Scholar 

  194. Zhang P, Zhou W, Wu L, et al. SOM: semantic obviousness metric for image quality assessment. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2015. 2394–2402

    Google Scholar 

  195. Mittal A, Muralidhar G S, Ghosh J, et al. Blind image quality assessment without human training using latent quality factors. IEEE Signal Process Lett, 2012, 19: 75–78

    Google Scholar 

  196. Jiang Q, Shao F, Jiang G, et al. Supervised dictionary learning for blind image quality assessment using qualityconstraint sparse coding. J Visual Commun Image Represent, 2015, 33: 123–133

    Google Scholar 

  197. Xie X, Zhang Y, Wu J, et al. Bag-of-words feature representation for blind image quality assessment with local quantized pattern. Neurocomputing, 2017, 266: 176–187

    Google Scholar 

  198. Jiang Q, Shao F, Lin W, et al. Optimizing multistage discriminative dictionaries for blind image quality assessment. IEEE Trans Mult, 2018, 20: 2035–2048

    Google Scholar 

  199. He L, Tao D, Li X, et al. Sparse representation for blind image quality assessment. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2012. 1146–1153

    Google Scholar 

  200. Zhang C, Pan J, Chen S, et al. No reference image quality assessment using sparse feature representation in two dimensions spatial correlation. Neurocomputing, 2016, 173: 462–470

    Google Scholar 

  201. Wu Q, Li H, Meng F, et al. Blind image quality assessment based on multichannel feature fusion and label transfer. IEEE Trans Circ Syst Video Technol, 2016, 26: 425–440

    Google Scholar 

  202. Wu Q, Li H, Meng F, et al. No reference image quality assessment metric via multi-domain structural information and piecewise regression. J Visual Commun Image Represent, 2015, 32: 205–216

    Google Scholar 

  203. Wu Q, Li H, Ngan K N, et al. Blind image quality assessment using local consistency aware retriever and uncertainty aware evaluator. IEEE Trans Circ Syst Video Tech, 2018, 28: 2078–2089

    Google Scholar 

  204. Fang R, Al-Bayaty R, Wu D. BNB method for no-reference image quality assessment. IEEE Trans Circ Syst Video Technol, 2017, 27: 1381–1391

    Google Scholar 

  205. Gao F, Tao D, Gao X, et al. Learning to rank for blind image quality assessment. IEEE Trans Neural Netw Learn Syst, 2015, 26: 2275–2290

    MathSciNet  Google Scholar 

  206. Ma K, Liu W, Liu T, et al. dipIQ: blind image quality assessment by learning-to-rank discriminable image pairs. IEEE Trans Image Process, 2017, 26: 3951–3964

    MathSciNet  MATH  Google Scholar 

  207. Burges C, Shaked T, Renshaw E, et al. Learning to rank using gradient descent. In: Proceedings of International Conference on Machine Learning, 2005. 89–96

    Google Scholar 

  208. Xu L, Li J, Lin W, et al. Multi-task rank learning for image quality assessment. IEEE Trans Circ Syst Video Technol, 2017, 27: 1833–1843

    Google Scholar 

  209. Pang Y, Zhou B, Nie F. Simultaneously learning neighborship and projection matrix for supervised dimensionality reduction. IEEE Trans Neural Netw Learn Syst, 2019, 30: 2779–2793

    MathSciNet  Google Scholar 

  210. Pang Y, Cao J, Li X. Cascade learning by optimally partitioning. IEEE Trans Cyber, 2016, 47: 4148–4161

    Google Scholar 

  211. Han Z Y, Wu H B, Wei B Z, et al. Recursive narrative alignment for movie narrating. Sci China Inf Sci, 2020, 63: 174101

    Google Scholar 

  212. Zhang W T, Jiang J W, Shao Y X, et al. Snapshot boosting: a fast ensemble framework for deep neural networks. Sci China Inf Sci, 2020, 63: 112102

    Google Scholar 

  213. Habimana O, Li Y H, Li R H, et al. Sentiment analysis using deep learning approaches: an overview. Sci China Inf Sci, 2020, 63: 111102

    Google Scholar 

  214. Chen S T, Jian Z Q, Huang Y H, et al. Autonomous driving: cognitive construction and situation understanding. Sci China Inf Sci, 2019, 62: 081101

    Google Scholar 

  215. Gu K, Zhai G, Yang X, et al. Deep learning network for blind image quality assessment. In: Proceedings of IEEE International Conference on Image Processing, 2014. 511–515

    Google Scholar 

  216. Li Y, Po L M, Xu X, et al. No-reference image quality assessment with shearlet transform and deep neural networks. Neurocomputing, 2015, 154: 94–109

    Google Scholar 

  217. Lv Y, Jiang G, Yu M, et al. Difference of gaussian statistical features based blind image quality assessment: a deep learning approach. In: Proceedings of IEEE International Conference on Image Processing, 2015. 2344–2348

    Google Scholar 

  218. Li C F, Bovik A C, Wu X J. Blind image quality assessment using a general regression neural network. IEEE Trans Neural Netw, 2011, 22: 793–799

    Google Scholar 

  219. Specht D F. A general regression neural network. IEEE Trans Neural Netw, 1991, 2: 568–576

    Google Scholar 

  220. Tang H, Joshi N, Kapoor A. Blind image quality assessment using semi-supervised rectifier networks. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2014. 2877–2884

    Google Scholar 

  221. Hinton G E. Reducing the dimensionality of data with neural networks. Science, 2006, 313: 504–507

    MathSciNet  MATH  Google Scholar 

  222. Hou W L, Gao X B, Tao D C, et al. Blind image quality assessment via deep learning. IEEE Trans Neural Netw Learn Syst, 2015, 26: 1275–1286

    MathSciNet  Google Scholar 

  223. Kang L, Ye P, Li Y, et al. Convolutional neural networks for no-reference image quality assessment. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2014. 1733–1740

    Google Scholar 

  224. Bosse S, Maniry D, Wiegand T, et al. A deep neural network for image quality assessment. In: Proceedings of IEEE International Conference on Image Processing, 2016. 3773–3777

    Google Scholar 

  225. Kang L, Ye P, Li Y, et al. Simultaneous estimation of image quality and distortion via multi-task convolutional neural networks. In: Proceedings of IEEE International Conference on Image Processing, 2015. 2791–2795

    Google Scholar 

  226. Kim J, Lee S. Fully deep blind image quality predictor. IEEE J Sel Top Signal Process, 2017, 11: 206–220

    Google Scholar 

  227. Gu J, Meng G, Redi J A, et al. Blind image quality assessment via vector regression and object oriented pooling. IEEE Trans Mult, 2018, 20: 1140–1153

    Google Scholar 

  228. Kim J, Nguyen A D, Lee S. Deep CNN-based blind image quality predictor. IEEE Trans Neural Netw Learn Syst, 2019, 30: 11–24

    Google Scholar 

  229. Pan D, Shi P, Hou M, et al. Blind predicting similar quality map for image quality assessment. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2018. 6373–6382

    Google Scholar 

  230. Ma K, Liu W, Zhang K, et al. End-to-end blind image quality assessment using deep neural networks. IEEE Trans Image Process, 2018, 27: 1202–1213

    MathSciNet  MATH  Google Scholar 

  231. Lin K Y, Wang G. Hallucinated-iqa: no-reference image quality assessment via adversarial learning. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2018. 732–741

    Google Scholar 

  232. Liu X, van de Weijer J, Bagdanov A D. Rankiqa: learning from rankings for no-reference image quality assessment. In: Proceedings of IEEE International Conference on Computer Vision, 2017. 1040–1049

    Google Scholar 

  233. Talebi H, Milanfar P. NIMA: neural image assessment. IEEE Trans Image Process, 2018, 27: 3998–4011

    MathSciNet  MATH  Google Scholar 

  234. Guan J, Yi S, Zeng X, et al. Visual importance and distortion guided deep image quality assessment framework. IEEE Trans Mult, 2017, 19: 2505–2520

    Google Scholar 

  235. Zhai G, Wu X, Yang X, et al. A psychovisual quality metric in free-energy principle. IEEE Trans Image Process, 2012, 21: 41–52

    MathSciNet  MATH  Google Scholar 

  236. Zhai G, Min X, Liu N. Free-energy principle inspired visual quality assessment: an overview. Digital Signal Process, 2019, 91: 11–20

    Google Scholar 

  237. Gu K, Zhai G, Yang X, et al. Using free energy principle for blind image quality assessment. IEEE Trans Mult, 2015, 17: 50–63

    Google Scholar 

  238. Li Q, Lin W, Xu J, et al. Blind image quality assessment using statistical structural and luminance features. IEEE Trans Mult, 2016, 18: 2457–2469

    Google Scholar 

  239. Li Q, Lin W, Fang Y. BSD: blind image quality assessment based on structural degradation. Neurocomputing, 2017, 236: 93–103

    Google Scholar 

  240. Min X, Gu K, Zhai G, et al. Blind quality assessment based on pseudo-reference image. IEEE Trans Mult, 2018, 20: 2049–2062

    Google Scholar 

  241. Min X, Zhai G, Gu K, et al. Blind image quality estimation via distortion aggravation. IEEE Trans Broadcast, 2018, 64: 508–517

    Google Scholar 

  242. Wu Q, Li H, Wang Z, et al. Blind image quality assessment based on rank-order regularized regression. IEEE Trans Mult, 2017, 19: 2490–2504

    Google Scholar 

  243. Saha A, Wu Q M J. Utilizing image scales towards totally training free blind image quality assessment. IEEE Trans Image Process, 2015, 24: 1879–1892

    MathSciNet  MATH  Google Scholar 

  244. Liu T J, Liu K H. No-reference image quality assessment by wide-perceptual-domain scorer ensemble method. IEEE Trans Image Process, 2018, 27: 1138–1151

    MathSciNet  MATH  Google Scholar 

  245. Freitas P G, Akamine W Y L, Farias M C Q. No-reference image quality assessment using orthogonal color planes patterns. IEEE Trans Mult, 2018, 20: 3353–3360

    Google Scholar 

  246. Lambooij M, IJsselsteijn W, Bouwhuis D G, et al. Evaluation of stereoscopic images: beyond 2D quality. IEEE Trans Broadcast, 2011, 57: 432–444

    Google Scholar 

  247. Wang J, Wang S, Ma K, et al. Perceptual depth quality in distorted stereoscopic images. IEEE Trans Image Process, 2017, 26: 1202–1215

    MathSciNet  MATH  Google Scholar 

  248. Yun N, Feng Z, Yang J, et al. The objective quality assessment of stereo image. Neurocomputing, 2013, 120: 121–129

    Google Scholar 

  249. You J, Xing L, Perkis A, et al. Perceptual quality assessment for stereoscopic images based on 2D image quality metrics and disparity analysis. In: Proceedings of International Workshop on Video Processing and Quality Metrics for Consumer Electronics, 2010

    Google Scholar 

  250. Akhter R, Sazzad Z M P, Horita Y, et al. No-reference stereoscopic image quality assessment. In: Proceedings of SPIE, 2010. 7524

    Google Scholar 

  251. Hewage C T E R, Martini M G. Reduced-reference quality metric for 3D depth map transmission. In: Proceedings of 3DTV-Conference: the True Vision - Capture, Transmission and Display of 3D Video, 2010. 1–4

    Google Scholar 

  252. Maalouf A, Larabi M C. CYCLOP: a stereo color image quality assessment metric. In: Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing, 2011. 1161–1164

    Google Scholar 

  253. Shao F, Tian W, Lin W, et al. Toward a blind deep quality evaluator for stereoscopic images based on monocular and binocular interactions. IEEE Trans Image Process, 2016, 25: 2059–2074

    MathSciNet  MATH  Google Scholar 

  254. Yang J, Zhao Y, Zhu Y, et al. Blind assessment for stereo images considering binocular characteristics and deep perception map based on deep belief network. Inf Sci, 2019, 474: 1–17

    Google Scholar 

  255. Stidwill D, Fletcher R. Normal Binocular Vision: Theory, Investigation and Practical Aspects. Hoboken: John Wiley & Sons, 2010

    Google Scholar 

  256. Li Z, Atick J J. Efficient stereo coding in the multiscale representation. Netw Comput Neural Syst, 1994, 5: 157–174

    MATH  Google Scholar 

  257. Moorthy A K, Su C C, Mittal A, et al. Subjective evaluation of stereoscopic image quality. Signal Process Image Commun, 2013, 28: 870–883

    Google Scholar 

  258. Gorley P, Holliman N. Stereoscopic image quality metrics and compression. In: Proceedings of SPIE, 2008. 680305

    Google Scholar 

  259. Yasakethu S L P, Hewage C T E R, Fernando W A C, et al. Quality analysis for 3D video using 2D video quality models. IEEE Trans Consumer Electron, 2008, 54: 1969–1976

    Google Scholar 

  260. Meegan D V, Stelmach L B, Tam W J. Unequal weighting of monocular inputs in binocular combination: implications for the compression of stereoscopic imagery. J Exp Psychol Appl, 2001, 7: 143–153

    Google Scholar 

  261. Fang Y, Yan J, Liu X, et al. Stereoscopic image quality assessment by deep convolutional neural network. J Visual Commun Image Represent, 2019, 58: 400–406

    Google Scholar 

  262. Zhou W, Chen Z, Li W. Dual-stream interactive networks for no-reference stereoscopic image quality assessment. IEEE Trans Image Process, 2019, 28: 3946–3958

    MathSciNet  MATH  Google Scholar 

  263. Kumano H, Tanabe S, Fujita I. Spatial frequency integration for binocular correspondence in macaque area V4. J Neuro Physiol, 2008, 99: 402–408

    Google Scholar 

  264. Lin Y H, Wu J L. Quality assessment of stereoscopic 3D image compression by binocular integration behaviors. IEEE Trans Image Process, 2014, 23: 1527–1542

    MathSciNet  MATH  Google Scholar 

  265. Jiang G, Xu H, Yu M, et al. Stereoscopic image quality assessment by learning non-negative matrix factorizationbased color visual characteristics and considering binocular interactions. J Visual Commun Image Represent, 2017, 46: 269–279

    Google Scholar 

  266. Kingdom F A A. Binocular vision: the eyes add and subtract. Curr Biol, 2012, 22: 22–24

    Google Scholar 

  267. Yang J, Liu Y, Gao Z, et al. A perceptual stereoscopic image quality assessment model accounting for binocular combination behavior. J Visual Commun Image Represent, 2015, 31: 138–145

    Google Scholar 

  268. Lin C, Chen Z, Liao N. Full-reference quality assessment for stereoscopic images based on binocular vision model. In: Proceedings of IEEE International Conference on Visual Communications and Image Processing, 2016. 1–4

    Google Scholar 

  269. Qian N, Mikaelian S. Relationship between phase and energy methods for disparity computation. Neural Comput, 2000, 12: 279–292

    Google Scholar 

  270. Field D J. Relations between the statistics of natural images and the response properties of cortical cells. J Opt Soc Am Opt Image Sci, 1987, 4: 2379

    Google Scholar 

  271. Lin Y, Yang J, Lu W, et al. Quality index for stereoscopic images by jointly evaluating cyclopean amplitude and cyclopean phase. IEEE J Sel Top Signal Process, 2017, 11: 89–101

    Google Scholar 

  272. Kruger N, Janssen P, Kalkan S, et al. Deep hierarchies in the primate visual cortex: what can we learn for computer vision? IEEE Trans Pattern Anal Mach Intell, 2013, 35: 1847–1871

    Google Scholar 

  273. Barlow H B. Foundations of cyclopean perception. Behav Sci, 1974, 17: 310–312

    Google Scholar 

  274. Grossberg S, Kelly F. Neural dynamics of binocular brightness perception. Vision Res, 1999, 39: 3796–3816

    Google Scholar 

  275. Liu Y, Yang J, Meng Q, et al. Stereoscopic image quality assessment method based on binocular combination saliency model. Signal Process, 2016, 125: 237–248

    Google Scholar 

  276. Zhou W, Jiang G, Yu M, et al. PMFS: a perceptual modulated feature similarity metric for stereoscopic image quality assessment. IEEE Signal Process Lett, 2014, 21: 1003–1006

    Google Scholar 

  277. Li S, Han X, Chang Y. Adaptive cyclopean image-based stereoscopic image-quality assessment using ensemble learning. IEEE Trans Mult, 2019, 21: 2616–2624

    Google Scholar 

  278. Md S K, Appina B, Channappayya S S. Full-reference stereo image quality assessment using natural stereo scene statistics. IEEE Signal Process Lett, 2015, 22: 1985–1989

    Google Scholar 

  279. Ko H, Song R, Jay Kuo C C. A ParaBoost stereoscopic image quality assessment (PBSIQA) system. J Visual Commun Image Represent, 2017, 45: 156–169

    Google Scholar 

  280. Wang X, Liu Q, Wang R, et al. Natural image statistics based 3D reduced reference image quality assessment in contourlet domain. Neurocomputing, 2015, 151: 683–691

    Google Scholar 

  281. Su C C, Cormack L K, Bovik A C. Oriented correlation models of distorted natural images with application to natural stereopair quality evaluation. IEEE Trans Image Process, 2015, 24: 1685–1699

    MathSciNet  MATH  Google Scholar 

  282. Ma L, Wang X, Liu Q, et al. Reorganized DCT-based image representation for reduced reference stereoscopic image quality assessment. Neurocomputing, 2016, 215: 21–31

    Google Scholar 

  283. Zhou W, Yu L. Binocular responses for no-reference 3D image quality assessment. IEEE Trans Mult, 2016, 18: 1077–1084

    Google Scholar 

  284. Zhou W, Qiu W, Wu M W. Utilizing dictionary learning and machine learning for blind quality assessment of 3-D images. IEEE Trans Broadcast, 2017, 63: 404–415

    Google Scholar 

  285. Olshausen B A, Field D J. Sparse coding with an overcomplete basis set: a strategy employed by V1? Vision Res, 1997, 37: 3311–3325

    Google Scholar 

  286. Elad M. Sparse and Redundant Representations: From Theory to Applications in Signal and Image Processing. Berlin: Springer, 2010

    MATH  Google Scholar 

  287. Shao F, Lin W S, Wang S S, et al. Blind image quality assessment for stereoscopic images using binocular guided quality lookup and visual codebook. IEEE Trans Broadcast, 2015, 61: 154–165

    Google Scholar 

  288. Shao F, Li K, Lin W, et al. Full-reference quality assessment of stereoscopic images by learning binocular receptive field properties. IEEE Trans Image Process, 2015, 24: 2971–2983

    MathSciNet  MATH  Google Scholar 

  289. Qi F, Zhao D, Gao W. Reduced reference stereoscopic image quality assessment based on binocular perceptual information. IEEE Trans Mult, 2015, 17: 2338–2344

    Google Scholar 

  290. Shao F, Li K, Lin W, et al. Learning blind quality evaluator for stereoscopic images using joint sparse representation. IEEE Trans Mult, 2016, 18: 2104–2114

    Google Scholar 

  291. Shao F, Li K, Lin W, et al. Using binocular feature combination for blind quality assessment of stereoscopic images. IEEE Signal Process Lett, 2015, 22: 1548–1551

    Google Scholar 

  292. Shao F, Lin W, Wang S, et al. Learning receptive fields and quality lookups for blind quality assessment of stereoscopic images. IEEE Trans Cybern, 2016, 46: 730–743

    Google Scholar 

  293. Shao F, Zhang Z, Jiang Q, et al. Toward domain transfer for no-reference quality prediction of asymmetrically distorted stereoscopic images. IEEE Trans Circ Syst Video Technol, 2018, 28: 573–585

    Google Scholar 

  294. Shao F, Tian W, Lin W, et al. Learning sparse representation for no-reference quality assessment of multiply distorted stereoscopic images. IEEE Trans Mult, 2017, 19: 1821–1836

  295. Vu C T, Larson E C, Chandler D M. Visual fixation patterns when judging image quality: effects of distortion type, amount, and subject experience. In: Proceedings of IEEE Southwest Symposium on Image Analysis and Interpretation, 2008. 73–76

  296. Liu H T, Heynderickx I. Visual attention in objective image quality assessment: based on eye-tracking data. IEEE Trans Circ Syst Video Technol, 2011, 21: 971–982

  297. Liu H T, Engelke U, Wang J, et al. How does image content affect the added value of visual attention in objective image quality assessment? IEEE Signal Process Lett, 2013, 20: 355–358

  298. Wang Q, Xu L, Chen Q, et al. Import of distortion on saliency applied to image quality assessment. In: Proceedings of IEEE International Conference on Image Processing, 2014. 1165–1169

  299. Rai Y, Callet P L, Guillotel P. Which saliency weighting for omni directional image quality assessment? In: Proceedings of IEEE International Conference on Quality of Multimedia Experience, 2017. 1–6

  300. Zhang W, Liu H. Toward a reliable collection of eye-tracking data for image quality research: challenges, solutions, and applications. IEEE Trans Image Process, 2017, 26: 2424–2437

  301. Ma Q, Zhang L. Image quality assessment with visual attention. In: Proceedings of IEEE International Conference on Pattern Recognition, 2008. 1–4

  302. Zhang W, Talens-Noguera J V, Liu H. The quest for the integration of visual saliency models in objective image quality assessment: a distraction power compensated combination strategy. In: Proceedings of IEEE International Conference on Image Processing, 2015. 1250–1254

  303. Zhang W, Borji A, Wang Z, et al. The application of visual saliency models in objective image quality assessment: a statistical evaluation. IEEE Trans Neural Netw Learn Syst, 2016, 27: 1266–1278

  304. Wen Y, Li Y, Zhang X, et al. A weighted full-reference image quality assessment based on visual saliency. J Visual Commun Image Represent, 2017, 43: 119–126

  305. Xia Y, Liu Z, Yan Y, et al. Media quality assessment by perceptual gaze-shift patterns discovery. IEEE Trans Mult, 2017, 19: 1811–1820

  306. Zhang W, Martin R R, Liu H. A saliency dispersion measure for improving saliency-based image quality metrics. IEEE Trans Circ Syst Video Technol, 2018, 28: 1462–1466

  307. Mittal A, Moorthy A K, Bovik A C, et al. Automatic prediction of saliency on JPEG distorted images. In: Proceedings of IEEE International Workshop on Quality of Multimedia Experience, 2011. 195–200

  308. Winterlich A, Zlokolica V, Denny P, et al. A saliency weighted no-reference perceptual blur metric for the automotive environment. In: Proceedings of IEEE International Workshop on Quality of Multimedia Experience, 2013. 206–211

  309. Harel J, Koch C, Perona P. Graph-based visual saliency. In: Proceedings of the 20th Annual Conference on Neural Information Processing Systems, 2006. 545–552

  310. Itti L, Koch C, Niebur E. A model of saliency-based visual attention for rapid scene analysis. IEEE Trans Pattern Anal Machine Intell, 1998, 20: 1254–1259

  311. Nasrinpour H R, Bruce N D. Saliency weighted quality assessment of tone-mapped images. In: Proceedings of IEEE International Conference on Image Processing, 2015. 4947–4951

  312. Bruce N T J. Attention based on information maximization. J Vision, 2007, 7: 950–950

  313. Kundu D, Evans B L. Visual attention guided quality assessment of tone-mapped images using scene statistics. In: Proceedings of IEEE International Conference on Image Processing, 2016. 96–100

  314. Min X, Zhai G, Gao Z, et al. Influence of compression artifacts on visual attention. In: Proceedings of IEEE International Conference on Multimedia and Expo, 2014. 1–6

  315. Che Z, Borji A, Zhai G, et al. How is gaze influenced by image transformations? Dataset and model. IEEE Trans Image Process, 2020, 29: 2287–2300

  316. Che Z, Zhai G, Min X. Influence of spatial resolution on state-of-the-art saliency models. In: Proceedings of Pacific Rim Conference on Multimedia, 2015. 74–83

  317. Coutrot A, Guyader N. How saliency, faces, and sound influence gaze in dynamic social scenes. J Vision, 2014, 14: 5

  318. Min X, Zhai G, Gu K, et al. Fixation prediction through multimodal analysis. ACM Trans Mult Comput Commun Appl, 2017, 13: 6

  319. Min X, Zhai G, Gao Z, et al. Sound influences visual attention discriminately in videos. In: Proceedings of IEEE International Workshop on Quality of Multimedia Experience, 2014. 153–158

  320. Min X, Zhai G, Zhou J, et al. A multimodal saliency model for videos with high audio-visual correspondence. IEEE Trans Image Process, 2020, 29: 3805–3819

  321. Min X, Zhai G, Gu K, et al. Visual attention analysis and prediction on human faces. Inf Sci, 2017, 420: 417–430

  322. Duan H, Min X, Fang Y, et al. Visual attention analysis and prediction on human faces for children with autism spectrum disorder. ACM Trans Mult Comput Commun Appl, 2019, 15: 90

  323. Wang S, Jiang M, Duchesne X M, et al. Atypical visual saliency in autism spectrum disorder quantified through model-based eye tracking. Neuron, 2015, 88: 604–616

  324. Duan H, Zhai G, Min X, et al. A dataset of eye movements for the children with autism spectrum disorder. In: Proceedings of the 10th ACM Multimedia Systems Conference, 2019. 255–260

  325. Duan H, Zhai G, Min X, et al. Learning to predict where the children with ASD look. In: Proceedings of IEEE International Conference on Image Processing, 2018. 704–708

  326. Hou W, Gao X. Be natural: a saliency-guided deep framework for image quality. In: Proceedings of IEEE International Conference on Multimedia and Expo, 2014. 1–6

  327. Zhang L, Shen Y, Li H. VSI: a visual saliency-induced index for perceptual image quality assessment. IEEE Trans Image Process, 2014, 23: 4270–4281

  328. Zhang L, Gu Z, Li H. SDSP: a novel saliency detection method by combining simple priors. In: Proceedings of IEEE International Conference on Image Processing, 2013. 171–175

  329. Zhang W, Liu H. Learning picture quality from visual distraction: psychophysical studies and computational models. Neurocomputing, 2017, 247: 183–191

  330. Yang H, Fang Y, Lin W. Perceptual quality assessment of screen content images. IEEE Trans Image Process, 2015, 24: 4408–4421

  331. Gu K, Wang S, Zhai G, et al. Screen image quality assessment incorporating structural degradation measurement. In: Proceedings of IEEE International Symposium on Circuits and Systems, 2015. 125–128

  332. Wang S, Gu K, Zeng K, et al. Perceptual screen content image quality assessment and compression. In: Proceedings of IEEE International Conference on Image Processing, 2015. 1434–1438

  333. Wang S, Gu K, Zeng K, et al. Objective quality assessment and perceptual compression of screen content images. IEEE Comput Grap Appl, 2018, 38: 47–58

  334. Ni Z, Ma L, Zeng H, et al. Gradient direction for screen content image quality assessment. IEEE Signal Process Lett, 2016, 23: 1394–1398

  335. Ni Z, Ma L, Zeng H, et al. Screen content image quality assessment using edge model. In: Proceedings of IEEE International Conference on Image Processing, 2016. 81–85

  336. Ni Z, Zeng H, Ma L, et al. A Gabor feature-based quality assessment model for the screen content images. IEEE Trans Image Process, 2018, 27: 4516–4528

  337. Fu Y, Zeng H, Ma L, et al. Screen content image quality assessment using multi-scale difference of Gaussian. IEEE Trans Circ Syst Video Technol, 2018, 28: 2428–2432

  338. Gu K, Wang S, Yang H, et al. Saliency-guided quality assessment of screen content images. IEEE Trans Mult, 2016, 18: 1098–1110

  339. Gu K, Qiao J, Min X, et al. Evaluating quality of screen content images via structural variation analysis. IEEE Trans Visual Comput Graph, 2017, 24: 2689–2701

  340. Fang Y, Yan J, Liu J, et al. Objective quality assessment of screen content images by uncertainty weighting. IEEE Trans Image Process, 2017, 26: 2016–2027

  341. Zhang Y, Chandler D M, Mou X. Quality assessment of screen content images via convolutional-neural-network-based synthetic/natural segmentation. IEEE Trans Image Process, 2018, 27: 5113–5128

  342. Wang S, Gu K, Zhang X, et al. Subjective and objective quality assessment of compressed screen content images. IEEE J Emerg Sel Top Circ Syst, 2016, 6: 532–543

  343. Wang S, Gu K, Zhang X, et al. Reduced-reference quality assessment of screen content images. IEEE Trans Circ Syst Video Technol, 2018, 28: 1–14

  344. Jakhetiya V, Gu K, Lin W, et al. A prediction backed model for quality assessment of screen content and 3-D synthesized images. IEEE Trans Ind Inf, 2018, 14: 652–660

  345. Gu K, Zhai G, Lin W, et al. Learning a blind quality evaluation engine of screen content images. Neurocomputing, 2016, 196: 140–149

  346. Zuo L, Wang H, Fu J. Screen content image quality assessment via convolutional neural network. In: Proceedings of IEEE International Conference on Image Processing, 2016. 2082–2086

  347. Gu K, Zhou J, Qiao J F, et al. No-reference quality assessment of screen content pictures. IEEE Trans Image Process, 2017, 26: 4005–4018

  348. Shao F, Gao Y, Li F, et al. Toward a blind quality predictor for screen content images. IEEE Trans Syst Man Cybern Syst, 2018, 48: 1521–1530

  349. Fang Y, Yan J, Li L, et al. No reference quality assessment for screen content images with both local and global feature representation. IEEE Trans Image Process, 2018, 27: 1600–1610

  350. Zhou W, Yu L, Zhou Y, et al. Local and global feature learning for blind quality evaluation of screen content and natural scene images. IEEE Trans Image Process, 2018, 27: 2086–2095

  351. Ma K, Yeganeh H, Zeng K, et al. High dynamic range image compression by optimizing tone mapped image quality index. IEEE Trans Image Process, 2015, 24: 3086–3097

  352. Gu K, Wang S, Zhai G, et al. Blind quality assessment of tone-mapped images via analysis of information, naturalness, and structure. IEEE Trans Mult, 2016, 18: 432–443

  353. Gu K, Zhai G, Liu M, et al. Details preservation inspired blind quality metric of tone mapping methods. In: Proceedings of IEEE International Symposium on Circuits and Systems, 2014. 518–521

  354. Nafchi H Z, Shahkolaei A, Moghaddam R F, et al. FSITM: a feature similarity index for tone-mapped images. IEEE Signal Process Lett, 2015, 22: 1026–1029

  355. Kundu D, Ghadiyaram D, Bovik A C, et al. No-reference quality assessment of tone-mapped HDR pictures. IEEE Trans Image Process, 2017, 26: 2957–2971

  356. Hadizadeh H, Bajic I V. Full-reference objective quality assessment of tone-mapped images. IEEE Trans Mult, 2018, 20: 392–404

  357. Yue G, Hou C, Gu K, et al. Biologically inspired blind quality assessment of tone-mapped images. IEEE Trans Indust Electron, 2018, 65: 2525–2536

  358. Yue G, Hou C, Zhou T. Blind quality assessment of tone-mapped images considering colorfulness, naturalness and structure. IEEE Trans Indust Electron, 2019, 66: 3784–3793

  359. Xydeas C S, Petrovic V S. Objective pixel-level image fusion performance measure. In: Proceedings of International Society for Optics and Photonics, 2000. 89–98

  360. Qu G, Zhang D, Yan P. Information measure for performance of image fusion. Electron Lett, 2002, 38: 313–315

  361. Piella G, Heijmans H. A new quality metric for image fusion. In: Proceedings of IEEE International Conference on Image Processing, 2003. 173

  362. Cvejic N, Canagarajah C N, Bull D R. Image fusion metric based on mutual information and Tsallis entropy. Electron Lett, 2006, 42: 626–627

  363. Chen H, Varshney P K. A human perception inspired quality metric for image fusion based on regional information. Inf Fusion, 2007, 8: 193–207

  364. Zheng Y, Essock E A, Hansen B C, et al. A new metric based on extended spatial frequency and its application to DWT based fusion algorithms. Inf Fusion, 2007, 8: 177–192

  365. Wang P W, Liu B. A novel image fusion metric based on multi-scale analysis. In: Proceedings of IEEE International Conference on Signal Processing, 2008. 965–968

  366. Hossny M, Nahavandi S, Creighton D. Comments on ‘Information measure for performance of image fusion’. Electron Lett, 2008, 44: 1066–1067

  367. Chen Y, Blum R S. A new automated quality assessment algorithm for image fusion. Image Vision Comput, 2009, 27: 1421–1432

  368. Hassen R, Wang Z, Salama M M A. Objective quality assessment for multiexposure multifocus image fusion. IEEE Trans Image Process, 2015, 24: 2712–2724

  369. Karimi M, Samavi S, Karimi N, et al. Quality assessment of retargeted images by salient region deformity analysis. J Visual Commun Image Represent, 2017, 43: 108–118

  370. Ma L, Xu L, Zhang Y, et al. No-reference retargeted image quality assessment based on pairwise rank learning. IEEE Trans Mult, 2016, 18: 2228–2237

  371. Zhang Y, Fang Y, Lin W, et al. Backward registration-based aspect ratio similarity for image retargeting quality assessment. IEEE Trans Image Process, 2016, 25: 4286–4297

  372. Fang Y, Zeng K, Wang Z, et al. Objective quality assessment for image retargeting based on structural similarity. IEEE J Emerg Sel Top Circ Syst, 2014, 4: 95–105

  373. Hsu C C, Lin C W, Fang Y, et al. Objective quality assessment for image retargeting based on perceptual geometric distortion and information loss. IEEE J Sel Top Signal Process, 2014, 8: 377–389

  374. Chen Z, Lin J, Liao N, et al. Full reference quality assessment for image retargeting based on natural scene statistics modeling and bi-directional saliency similarity. IEEE Trans Image Process, 2017, 26: 5138–5148

  375. Zhang Y, Lin W, Li Q, et al. Multiple-level feature-based measure for retargeted image quality. IEEE Trans Image Process, 2018, 27: 451–463

  376. Zhang Y, Ngan K N, Ma L, et al. Objective quality assessment of image retargeting by incorporating fidelity measures and inconsistency detection. IEEE Trans Image Process, 2017, 26: 5980–5993

  377. Liang Y, Liu Y J, Gutierrez D. Objective quality prediction of image retargeting algorithms. IEEE Trans Visual Comput Graph, 2017, 23: 1099–1110

  378. Zhang F, Roysam B. Blind quality metric for multidistortion images based on cartoon and texture decomposition. IEEE Signal Process Lett, 2016, 23: 1265–1269

  379. Lu Y, Xie F, Liu T, et al. No reference quality assessment for multiply-distorted images based on an improved bag-of-words model. IEEE Signal Process Lett, 2015, 22: 1811–1815

  380. Hadizadeh H, Bajic I V. Color Gaussian jet features for no-reference quality assessment of multiply-distorted images. IEEE Signal Process Lett, 2016, 23: 1717–1721

  381. Li Q, Lin W, Fang Y. No-reference quality assessment for multiply-distorted images in gradient domain. IEEE Signal Process Lett, 2016, 23: 541–545

  382. Zhang Y, Chandler D M. Opinion-unaware blind quality assessment of multiply and singly distorted images via distortion parameter estimation. IEEE Trans Image Process, 2018, 27: 5433–5448

  383. Brooks A C, Zhao X N, Pappas T N. Structural similarity quality metrics in a coding context: exploring the space of realistic distortions. IEEE Trans Image Process, 2008, 17: 1261–1273

  384. Yang L, Du H, Xu J, et al. Blind image quality assessment on authentically distorted images with perceptual features. In: Proceedings of IEEE International Conference on Image Processing, 2016. 2042–2046

  385. Ghadiyaram D, Bovik A C. Scene statistics of authentically distorted images in perceptually relevant color spaces for blind image quality assessment. In: Proceedings of IEEE International Conference on Image Processing, 2015. 3851–3855

  386. Ghadiyaram D, Bovik A C. Perceptual quality prediction on authentically distorted images using a bag of features approach. J Vision, 2017, 17: 32–32

  387. Liu Y, Gu K, Wang S, et al. Blind quality assessment of camera images based on low-level and high-level statistical features. IEEE Trans Mult, 2019, 21: 135–146

  388. Sinno Z, Bovik A C. Large-scale study of perceptual video quality. IEEE Trans Image Process, 2018, 28: 612–627

  389. Battisti F, Bosc E, Carli M, et al. Objective image quality assessment of 3D synthesized views. Signal Process Image Commun, 2015, 30: 78–88

  390. Sandić-Stanković D, Kukolj D, Le Callet P. Multi-scale synthesized view assessment based on morphological pyramids. J Electr Eng, 2016, 67: 3–11

  391. Sandić-Stanković D, Kukolj D, Le Callet P. DIBR-synthesized image quality assessment based on morphological multiscale approach. EURASIP J Image Video Process, 2017, 2017: 4

  392. Li L, Zhou Y, Gu K, et al. Quality assessment of DIBR-synthesized images by measuring local geometric distortions and global sharpness. IEEE Trans Mult, 2018, 20: 914–926

  393. Gu K, Jakhetiya V, Qiao J F, et al. Model-based referenceless quality metric of 3D synthesized images using local image description. IEEE Trans Image Process, 2018, 27: 394–405

  394. Tian S, Zhang L, Morin L, et al. NIQSV+: a no-reference synthesized view quality assessment metric. IEEE Trans Image Process, 2018, 27: 1652–1664

  395. Zhou Y, Li L, Wang S, et al. No-reference quality assessment of DIBR-synthesized videos by measuring temporal flickering. J Visual Commun Image Represent, 2018, 55: 30–39

  396. Ling S, Li J, Che Z, et al. Quality assessment of free-viewpoint videos by quantifying the elastic changes of multi-scale motion trajectories. 2019. ArXiv: 190312107

  397. Li B, Ren W, Fu D, et al. Benchmarking single-image dehazing and beyond. IEEE Trans Image Process, 2018, 28: 492–505

  398. Hautiere N, Tarel J P, Aubert D, et al. Blind contrast enhancement assessment by gradient ratioing at visible edges. Image Anal Stereol, 2008, 27: 87–95

  399. Duan H, Zhai G, Yang X, et al. IVQAD 2017: an immersive video quality assessment database. In: Proceedings of International Conference on Systems, Signals and Image Processing, 2017. 1–5

  400. Xu M, Li C, Chen Z, et al. Assessing visual quality of omnidirectional videos. IEEE Trans Circ Syst Video Technol, 2019, 29: 3516–3530

  401. Duan H, Zhai G, Min X, et al. Assessment of visually induced motion sickness in immersive videos. In: Advances in Multimedia Information Processing—PCM 2017. Berlin: Springer, 2017. 662–672

  402. Yu M, Lakshman H, Girod B. A framework to evaluate omnidirectional video coding schemes. In: Proceedings of IEEE International Symposium on Mixed and Augmented Reality, 2015. 31–36

  403. Sun Y, Lu A, Yu L. WS-PSNR for 360 video objective quality evaluation. MPEG Joint Video Exploration Team, 2016. 116

  404. Zakharchenko V, Choi K P, Park J H. Quality metric for spherical panoramic video. In: Proceedings of Optics and Photonics for Information Processing X. 2016. 99700C

  405. Huang W, Ding L, Zhai G, et al. Utility-oriented resource allocation for 360-degree video transmission over heterogeneous networks. Digital Signal Process, 2019, 84: 1–14

  406. Gutiérrez J, David E, Rai Y, et al. Toolbox and dataset for the development of saliency and scanpath models for omnidirectional/360° still images. Signal Process Image Commun, 2018, 69: 35–42

  407. Zhu Y, Zhai G, Min X, et al. The prediction of saliency map for head and eye movements in 360 degree images. IEEE Trans Mult, 2019. doi: 10.1109/TMM.2019.2957986

  408. Zhu Y, Zhai G, Min X. The prediction of head and eye movement for 360 degree images. Signal Process Image Commun, 2018, 69: 15–25

  409. Fang Y, Zhang X, Imamoglu N. A novel superpixel-based saliency detection model for 360-degree images. Signal Process Image Commun, 2018, 69: 1–7

  410. Ling J, Zhang K, Zhang Y, et al. A saliency prediction model on 360 degree images using color dictionary based sparse representation. Signal Process Image Commun, 2018, 69: 60–68

  411. Monroy R, Lutz S, Chalasani T, et al. SalNet360: saliency maps for omni-directional images with CNN. Signal Process Image Commun, 2018, 69: 26–34

  412. Lebreton P, Raake A. GBVS360, BMS360, ProSal: extending existing saliency prediction models from 2D to omnidirectional images. Signal Process Image Commun, 2018, 69: 69–78

  413. Li C, Xu M, Zhang S, et al. State-of-the-art in 360° video/image processing: perception, assessment and compression. 2019. ArXiv: 190500161

  414. Chikkerur S, Sundaram V, Reisslein M, et al. Objective video quality assessment methods: a classification, review, and performance comparison. IEEE Trans Broadcast, 2011, 57: 165–182

  415. Seshadrinathan K, Soundararajan R, Bovik A C, et al. Study of subjective and objective quality assessment of video. IEEE Trans Image Process, 2010, 19: 1427–1441

  416. Pinson M H, Wolf S. A new standardized method for objectively measuring video quality. IEEE Trans Broadcast, 2004, 50: 312–322

  417. Zhai G T, Cai J F, Lin W S, et al. Cross-dimensional perceptual quality assessment for low bit-rate videos. IEEE Trans Mult, 2008, 10: 1316–1324

  418. Zhai G T, Cai J F, Lin W S, et al. Three dimensional scalable video adaptation via user-end perceptual quality assessment. IEEE Trans Broadcast, 2008, 54: 719–727

  419. Zhao T, Liu Q, Chen C W. QoE in video transmission: a user experience-driven strategy. IEEE Commun Surv Tutor, 2016, 19: 285–302

  420. Bampis C G, Li Z, Moorthy A K, et al. Study of temporal effects on subjective video quality of experience. IEEE Trans Image Process, 2017, 26: 5217–5231

  421. Bampis C G, Li Z, Katsavounidis I, et al. Towards perceptually optimized end-to-end adaptive video streaming. 2018. ArXiv: 180803898

  422. Ghadiyaram D, Pan J, Bovik A C. A subjective and objective study of stalling events in mobile streaming videos. IEEE Trans Circ Syst Video Technol, 2017, 29: 183–197

  423. Duanmu Z, Ma K, Wang Z. Quality-of-experience for adaptive streaming videos: an expectation confirmation theory motivated approach. IEEE Trans Image Process, 2018, 27: 6135–6146

  424. Duanmu Z, Rehman A, Wang Z. A quality-of-experience database for adaptive video streaming. IEEE Trans Broadcast, 2018, 64: 474–487

  425. Duanmu Z, Zeng K, Ma K, et al. A quality-of-experience index for streaming video. IEEE J Sel Top Signal Process, 2016, 11: 154–166

  426. Pinson M H, Janowski L, Pepion R, et al. The influence of subjects and environment on audiovisual subjective tests: an international study. IEEE J Sel Top Signal Process, 2012, 6: 640–651

  427. Akhtar Z, Falk T H. Audio-visual multimedia quality assessment: a comprehensive survey. IEEE Access, 2017, 5: 21090–21117

  428. Yu X, Bampis C G, Gupta P, et al. Predicting the quality of images compressed after distortion in two steps. IEEE Trans Image Process, 2019, 28: 5757–5770

  429. Gu K, Zhai G, Lin W, et al. The analysis of image contrast: from quality assessment to automatic enhancement. IEEE Trans Cybern, 2015, 46: 284–297

  430. Fang Y, Ma K, Wang Z, et al. No-reference quality assessment of contrast-distorted images based on natural scene statistics. IEEE Signal Process Lett, 2014, 22: 838–842

  431. Gu K, Lin W, Zhai G, et al. No-reference quality metric of contrast-distorted images based on information maximization. IEEE Trans Cybern, 2016, 47: 4559–4565

  432. Liu M, Gu K, Zhai G, et al. Perceptual reduced-reference visual quality assessment for contrast alteration. IEEE Trans Broadcast, 2017, 63: 71–81

  433. Krasula L, Le Callet P, Fliegel K, et al. Quality assessment of sharpened images: challenges, methodology, and objective metrics. IEEE Trans Image Process, 2017, 26: 1496–1508

  434. Deng Y, Loy C C, Tang X. Image aesthetic assessment: an experimental survey. IEEE Signal Process Mag, 2017, 34: 80–106

  435. Wu L, Jin X, Zhao G, et al. Two open-source projects for image aesthetic quality assessment. Sci China Inf Sci, 2019, 62: 027101

  436. Guo G J, Wang H Z, Yan Y, et al. Large margin deep embedding for aesthetic image classification. Sci China Inf Sci, 2020, 63: 119101

  437. Autrusseau F, Stutz T, Pankajakshan V. Subjective quality assessment of selective encryption techniques. 2010. Subjective database. http://ivc.univ-nantes.fr/en/databases/Selective_Encryption/

  438. Yue G, Hou C, Gu K, et al. No-reference quality evaluator of transparently encrypted images. IEEE Trans Mult, 2019, 21: 2184–2194

  439. Adhikarla V K, Vinkler M, Sumin D, et al. Towards a quality metric for dense light fields. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 2017. 58–67

  440. Min X, Zhou J, Zhai G, et al. A metric for light field reconstruction, compression, and display quality evaluation. IEEE Trans Image Process, 2020, 29: 3790–3804

  441. Viola I, Ebrahimi T. Valid: visual quality assessment for light field images dataset. In: Proceedings of IEEE International Conference on Quality of Multimedia Experience, 2018. 1–3

  442. Gupta P, Sinno Z, Glover J L, et al. Predicting detection performance on security X-ray images as a function of image quality. IEEE Trans Image Process, 2019, 28: 3328–3342

  443. Hu M, Zhai G, Xie R, et al. A wavelet-predominant algorithm can evaluate quality of THz security image and identify its usability. IEEE Trans Broadcast, 2019. doi: 10.1109/TBC.2019.2901388

  444. Hu M, Min X, Zhu W, et al. Terahertz security image quality assessment by no-reference model observers. In: Proceedings of International Forum on Digital TV and Wireless Multimedia Communications, 2017. 100–114

  445. Chen W, Gu K, Min X, et al. Partial-reference sonar image quality assessment for underwater transmission. IEEE Trans Aerosp Electron Syst, 2018, 54: 2776–2787

  446. Chen W, Gu K, Lin W, et al. Reference-free quality assessment of sonar images via contour degradation measurement. IEEE Trans Image Process, 2019, 28: 5336–5351

  447. Yan Z, Liu Q, Zhang T, et al. Exploring QoE for power efficiency: a field study on mobile videos with LCD displays. In: Proceedings of ACM International Conference on Multimedia, 2015. 431–440

  448. Sun W, Zhai G, Min X, et al. Dynamic backlight scaling considering ambient luminance for mobile energy saving. In: Proceedings of IEEE International Conference on Multimedia and Expo, 2017. 25–30

  449. Video Quality Experts Group (VQEG). Final Report From the Video Quality Experts Group on the Validation of Objective Models of Video Quality Assessment. https://www.its.bldrdoc.gov/vqeg/projects/frtv-phase-ii/frtv-phase-ii.aspx

  450. Tanchenko A. Visual-PSNR measure of image quality. J Visual Commun Image Represent, 2014, 25: 874–878

  451. Chang H, Zhang Q, Wu Q, et al. Perceptual image quality assessment by independent feature detector. Neurocomputing, 2015, 151: 1142–1152

  452. Wang S, Deng C, Lin W, et al. NMF-based image quality assessment using extreme learning machine. IEEE Trans Cybern, 2017, 47: 232–243

  453. Liu D, Xu Y, Quan Y, et al. Reduced reference image quality assessment using regularity of phase congruency. Signal Process Image Commun, 2014, 29: 844–855

  454. Zhou W, Zhang S, Pan T, et al. Blind 3D image quality assessment based on self-similarity of binocular features. Neurocomputing, 2016, 224: 128–134

Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant Nos. 61901260, 61831015, 61521062, 61527804).

Author information

Corresponding author

Correspondence to Guangtao Zhai.

About this article

Cite this article

Zhai, G., Min, X. Perceptual image quality assessment: a survey. Sci. China Inf. Sci. 63, 211301 (2020). https://doi.org/10.1007/s11432-019-2757-1
