Multimedia Tools and Applications, Volume 76, Issue 15, pp 16727–16748

On subjective quality assessment of adaptive video streaming via crowdsourcing and laboratory based experiments

  • Jacob Søgaard
  • Muhammad Shahid
  • Jeevan Pokhrel
  • Kjell Brunnström

Abstract

Video streaming services are offered over the Internet, and since service providers do not have full control over the network conditions all the way to the end user, streaming technologies have been developed that maintain the quality of service under varying network conditions, so-called adaptive video streaming. To cater for users’ Quality of Experience (QoE) requirements, HTTP-based adaptive streaming solutions for video services have become popular. However, the key factors that ensure users a good QoE with this technology are still not completely understood, and user QoE feedback is therefore instrumental in improving this understanding. Controlled laboratory-based perceptual quality experiments involving a panel of human viewers are considered the most valid method for assessing QoE. Besides such laboratory experiments, crowdsourcing-based subjective assessment of video quality is gaining popularity as an alternative method. This article presents a study that investigates perceptual preferences for various adaptive video streaming scenarios through both crowdsourcing-based and laboratory-based subjective assessment. The major novel contribution of this study is the application of paired comparison based subjective assessment in a crowdsourcing environment. Besides confirming earlier published trends, the obtained results provide some novel indications of perceptual preferences for adaptive video streaming scenarios. Our study suggests that in a network environment with bandwidth fluctuations, the best approach is a medium or low video bitrate that can be kept constant. Moreover, if there are only a few drops in bandwidth, one can choose a medium or high bitrate and accept a single or a few buffering events.
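Paired comparison data of the kind collected in this study are typically converted to a one-dimensional quality scale by fitting a choice model such as the Bradley–Terry model. As a minimal sketch of that step, the Python snippet below estimates Bradley–Terry scores from a matrix of pairwise preference counts using the standard minorization–maximization updates; the function name and the example win matrix are illustrative assumptions, not data or code from the study.

    import numpy as np

    def bradley_terry(wins, iters=200, tol=1e-9):
        """Estimate Bradley-Terry preference scores from paired-comparison data.

        wins[i, j] = number of times condition i was preferred over condition j.
        Returns a score vector normalized to sum to 1 (higher = more preferred).
        """
        n = wins.shape[0]
        comparisons = wins + wins.T        # n_ij: total comparisons per pair
        total_wins = wins.sum(axis=1)      # W_i: total wins per condition
        p = np.full(n, 1.0 / n)            # uniform starting point
        for _ in range(iters):
            # MM update: p_i <- W_i / sum_j [ n_ij / (p_i + p_j) ]
            denom = comparisons / (p[:, None] + p[None, :])
            np.fill_diagonal(denom, 0.0)
            p_new = total_wins / denom.sum(axis=1)
            p_new /= p_new.sum()
            if np.max(np.abs(p_new - p)) < tol:
                p = p_new
                break
            p = p_new
        return p

    # Hypothetical win matrix for three streaming conditions, e.g. constant
    # medium bitrate, frequent quality switching, and high bitrate with
    # rebuffering events (counts invented purely for illustration).
    wins = np.array([[ 0, 14, 18],
                     [ 6,  0, 12],
                     [ 2,  8,  0]])
    print(bradley_terry(wins))

Scores estimated this way are comparable across panels, which is one reason paired comparison lends itself to combining crowdsourcing and laboratory results on a common scale.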

Keywords

Adaptive video streaming · Crowdsourcing · Subjective quality assessment · Quality of experience

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Jacob Søgaard (1)
  • Muhammad Shahid (2, 3)
  • Jeevan Pokhrel (4)
  • Kjell Brunnström (5, 6)

  1. Technical University of Denmark, Kgs. Lyngby, Denmark
  2. Blekinge Institute of Technology, Karlskrona, Sweden
  3. Prince Sultan University, Riyadh, Saudi Arabia
  4. Montimage, Paris, France
  5. Acreo Swedish ICT AB, Kista, Sweden
  6. Mid Sweden University, Sundsvall, Sweden
