Subjective and Objective Visual Quality Assessment in the Context of Stereoscopic 3D-TV

  • Marcus Barkowsky
  • Kjell Brunnström
  • Touradj Ebrahimi
  • Lina Karam
  • Pierre Lebreton
  • Patrick Le Callet
  • Andrew Perkis
  • Alexander Raake
  • Mahesh Subedar
  • Kun Wang
  • Liyuan Xing
  • Junyong You

Abstract

Subjective and objective visual quality assessment in the context of stereoscopic three-dimensional TV (3D-TV) is still at a nascent stage and needs to account for the effect of the added depth dimension. Quality assessment of 3D-TV therefore cannot be treated as a trivial extension of the two-dimensional (2D) case. Moreover, 3D-TV may introduce negative effects not experienced in 2D, e.g., visual discomfort or nausea. Based on efforts initiated within the COST Action IC1003 QUALINET, this chapter discusses current challenges in subjective and objective visual quality assessment for stereo-based 3D-TV. Two case studies illustrate the current state of the art and some of the remaining challenges.
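
To illustrate the kind of objective-versus-subjective comparison discussed in the chapter, the following minimal Python sketch (not taken from the chapter) applies a standard 2D metric, SSIM, to each view of a synthetic stereo pair, averages the two scores per condition, and correlates the result with hypothetical mean opinion scores (MOS) on a 5-point ACR scale. The use of scikit-image and SciPy, the noise degradations, and the MOS values are all illustrative assumptions; an actual study would use rendered stereo sequences shown on a 3D display and ratings collected from a viewer panel.

```python
# Minimal sketch (illustrative only): per-view SSIM on a stereo pair,
# averaged and correlated with hypothetical subjective MOS values.
import numpy as np
from scipy.stats import pearsonr, spearmanr
from skimage.metrics import structural_similarity as ssim

rng = np.random.default_rng(0)

def stereo_ssim(ref_left, ref_right, test_left, test_right):
    """Average SSIM over the left and right views of a stereo pair."""
    s_left = ssim(ref_left, test_left, data_range=1.0)
    s_right = ssim(ref_right, test_right, data_range=1.0)
    return 0.5 * (s_left + s_right)

# Synthetic "reference" views and degraded versions (additive noise).
ref_l = rng.random((64, 64))
ref_r = rng.random((64, 64))
noise_levels = [0.02, 0.05, 0.10, 0.20, 0.40]

objective_scores = []
for sigma in noise_levels:
    test_l = np.clip(ref_l + rng.normal(0, sigma, ref_l.shape), 0, 1)
    test_r = np.clip(ref_r + rng.normal(0, sigma, ref_r.shape), 0, 1)
    objective_scores.append(stereo_ssim(ref_l, ref_r, test_l, test_r))

# Hypothetical MOS values (1-5 ACR scale) for the same five conditions.
mos = [4.6, 4.1, 3.4, 2.5, 1.7]

print("Pearson r:    %.3f" % pearsonr(objective_scores, mos)[0])
print("Spearman rho: %.3f" % spearmanr(objective_scores, mos)[0])
```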

Keywords

3D-TV · ACR · Correlation · Crosstalk perception · Coding error · Depth map · Human visual system · IPTV · Mean opinion score · Objective visual quality · Quality assessment · Quality of experience (QoE) · Reliability · Stereoscopic 3D-TV · Subjective visual quality · SSIM

Acknowledgments

This work was partially supported by the COST IC1003 European Network on QoE in Multimedia Systems and Services—QUALINET (http://www.qualinet.eu/).

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Marcus Barkowsky (1)
  • Kjell Brunnström (2)
  • Touradj Ebrahimi (3)
  • Lina Karam (4)
  • Pierre Lebreton (5)
  • Patrick Le Callet (1)
  • Andrew Perkis (6)
  • Alexander Raake (5)
  • Mahesh Subedar (7, 4)
  • Kun Wang (2, 8)
  • Liyuan Xing (6)
  • Junyong You (6)
  1. LUNAM Université, Université de Nantes, IRCCyN UMR CNRS 6597, Nantes Cedex 3, France
  2. Department of NetLab, Acreo AB, Kista, Sweden
  3. Multimedia Signal Processing Group (MMSPG), Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
  4. School of ECEE, Arizona State University, Tempe, USA
  5. Assessment of IP-Based Applications, Telekom Innovation Laboratories, TU Berlin, Berlin, Germany
  6. Centre for Quantifiable Quality of Service (Q2S) in Communication Systems, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
  7. Intel Corporation and Arizona State University, Chandler, USA
  8. Department of Information Technology and Media (ITM), Mid Sweden University, Sundsvall, Sweden
