An improved model for no-reference image quality assessment and a no-reference video quality assessment model based on frame analysis

  • Mukesh Kumar Rohil
  • Neetika Gupta
  • Prakash Yadav
Original Paper


No-reference image quality assessment (NR-IQA) uses only the test image to assess its quality, and since a video is essentially a sequence of image frames with an additional temporal dimension, video quality assessment (VQA) requires a thorough understanding of image quality assessment metrics and models. Therefore, to identify the features that degrade video quality, a fundamental analysis of spatial and temporal artifacts must be performed on individual video frames. Existing IQA and VQA metrics primarily capture only a few distortions and hence may not generalize across all types of images and videos. In this paper, we propose an NR-IQA model that combines three existing methods (namely NIQE, BRISQUE and BLIINDS-II) using multi-linear regression. We also present a holistic no-reference video quality assessment (NR-VQA) model that quantifies distortions such as ringing, frame difference, blocking, clipping and contrast in video frames. The proposed NR-IQA model shows improved performance compared to the state-of-the-art methods and requires only a small fraction of samples for training, yielding consistent accuracy over different training-to-testing ratios. The performance of the NR-VQA model is examined using a simple neural network model, which attains a high goodness of fit.
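The score-fusion idea described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the three metric scores for each image are already computed and stacked into a matrix, and fits the multi-linear regression weights by ordinary least squares against subjective quality targets (e.g. DMOS); the synthetic data at the bottom is purely for demonstration.

```python
# Hypothetical sketch of fusing three NR-IQA scores (e.g. NIQE, BRISQUE,
# BLIINDS-II outputs) into one prediction via multi-linear regression.
# All data below is synthetic; only the fusion mechanism is illustrated.
import numpy as np

def fit_fusion_weights(scores, targets):
    """scores: (n_images, 3) matrix of per-image metric scores.
    targets: (n_images,) subjective quality values (e.g. DMOS).
    Returns (4,) weight vector: bias followed by one weight per metric."""
    X = np.hstack([np.ones((scores.shape[0], 1)), scores])  # prepend bias column
    w, *_ = np.linalg.lstsq(X, targets, rcond=None)         # ordinary least squares
    return w

def predict_quality(w, score_triplet):
    """Combine one image's three metric scores into a fused quality score."""
    return w[0] + np.dot(w[1:], score_triplet)

# Toy demonstration on noiseless synthetic scores: the fit should
# recover the generating coefficients exactly.
rng = np.random.default_rng(0)
S = rng.uniform(0.0, 100.0, size=(50, 3))
y = 5.0 + 0.5 * S[:, 0] + 0.3 * S[:, 1] + 0.2 * S[:, 2]
w = fit_fusion_weights(S, y)
print(np.round(w, 2))  # recovers [5.0, 0.5, 0.3, 0.2] on this noiseless data
```

In practice, one would fit the weights on a subjectively rated training subset (e.g. of LIVE images) and apply `predict_quality` to unseen images; regularized variants of the regression are a natural alternative if the three metric scores are strongly correlated.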


No-reference image quality assessment · No-reference video quality assessment · Spatial artifacts · Temporal artifacts




Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2019

Authors and Affiliations

  1. Department of Computer Science and Information Systems, Birla Institute of Technology and Science, Pilani Campus, Pilani, India
