
On subjective quality assessment of adaptive video streaming via crowdsourcing and laboratory based experiments

Published in: Multimedia Tools and Applications

Abstract

Video streaming services are delivered over the Internet, where service providers do not have full control over network conditions all the way to the end user. Streaming technologies, collectively known as adaptive video streaming, have therefore been developed to maintain service quality under varying network conditions. In particular, HTTP-based adaptive streaming solutions have become popular for catering to users’ Quality of Experience (QoE) requirements. However, the keys to ensuring good QoE with this technology are still not completely understood, and user QoE feedback is therefore instrumental in improving this understanding. Controlled laboratory-based perceptual quality experiments involving a panel of human viewers are considered the most valid method of QoE assessment. Alongside such laboratory experiments, crowdsourcing-based subjective assessment of video quality is gaining popularity as an alternative. This article presents a study that investigates perceptual preferences among various adaptive video streaming scenarios through both crowdsourcing-based and laboratory-based subjective assessment. The major novel contribution of the study is the application of Paired Comparison based subjective assessment in a crowdsourcing environment. The results provide some novel indications of perceptual preferences for adaptive video streaming scenarios, besides confirming earlier published trends. Our study suggests that in a network environment with fluctuating bandwidth, the best approach is a medium or low video bitrate that can be kept constant. Moreover, if there are only a few drops in bandwidth, one can choose a medium or high bitrate and accept a single or a few buffering events.
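To illustrate how the outcome of a Paired Comparison test is typically turned into scalar preference scores, the sketch below fits a Bradley-Terry choice model to a paired-comparison win matrix using a standard iterative maximum-likelihood update. This is a minimal illustration only, not the analysis code used in the study; the scenario labels and win counts are hypothetical.

```python
# Minimal sketch: estimating Bradley-Terry preference scores from
# paired-comparison data (hypothetical example, not the study's own code).
import numpy as np

def bradley_terry(wins, n_iter=100, tol=1e-8):
    """Estimate Bradley-Terry scores from a win-count matrix.

    wins[i, j] = number of viewers who preferred scenario i over scenario j.
    Returns scores normalized to sum to 1 (higher = more preferred).
    """
    m = wins.shape[0]
    p = np.ones(m)                    # start with equal worth for every scenario
    total_wins = wins.sum(axis=1)     # W_i: comparisons won by scenario i
    pair_counts = wins + wins.T       # n_ij: times scenarios i and j were compared
    for _ in range(n_iter):
        denom = pair_counts / (p[:, None] + p[None, :])
        np.fill_diagonal(denom, 0.0)  # no self-comparisons
        p_new = total_wins / denom.sum(axis=1)
        p_new /= p_new.sum()          # normalize so the scores sum to 1
        if np.max(np.abs(p_new - p)) < tol:
            return p_new
        p = p_new
    return p

# Hypothetical win counts for four adaptation scenarios
# (order: constant-low, constant-medium, bitrate-switching, medium-with-buffering)
wins = np.array([
    [ 0, 10, 18, 20],
    [15,  0, 19, 22],
    [ 7,  6,  0, 14],
    [ 5,  3, 11,  0],
])
print(bradley_terry(wins))
```

Higher scores indicate scenarios preferred more often across the panel; in this hypothetical matrix the constant-bitrate scenarios come out ahead of the switching and buffering ones, mirroring the trend reported in the abstract.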




Notes

  1. The dataset used in this study cannot be made public due to copyright restrictions on the videos. However, interested researchers can obtain it through a bilateral agreement restricting its use to reproduction of the results.

  2. https://microworkers.com


Author information

Correspondence to Jacob Søgaard.


Cite this article

Søgaard, J., Shahid, M., Pokhrel, J. et al. On subjective quality assessment of adaptive video streaming via crowdsourcing and laboratory based experiments. Multimed Tools Appl 76, 16727–16748 (2017). https://doi.org/10.1007/s11042-016-3948-3
